How FAST is the RTX 4090 for 3D Animation + Rendering??

COMMENTS • 761

  • @SirWade
    @SirWade  2 years ago +96

    What do you think of the results so far?? Would you get use out of these speeds? And anything you'd like to see me cover next time? :)

    • @NightVisionOfficial
      @NightVisionOfficial 2 years ago +2

      Well... I want to do more complex sculpting in Blender, and learn UE5 faster, since my GPU is too slow for me to have any patience learning it :/. So... I guess it's a yes. Also, I was looking into Substance Painter, but I have many materials in Blender, so baking textures quicker would be good!

    • @SSPBradley11
      @SSPBradley11 2 years ago +4

      If you complete the test again I'd be interested in seeing how it handles particles for special effects.

    • @rVox-Dei
      @rVox-Dei 2 years ago

      I would love to see a review of how far the HIP backend has come. Can't wait for Radeon 7000, because Nvidia has a monopoly in Blender right now.

    • @adnanerochdi6982
      @adnanerochdi6982 2 years ago +4

      Of course this is a giant leap in performance gen-to-gen, and it is worth it for someone who needs to double their speeds right away. Thank you so much for the video, Wade, by the way. But the value proposition that render farms provide makes one rethink before making the purchase, especially if you have to upgrade the rest of your gear, like the PSU and motherboard. That's the main point for me, especially when a 3090 still provides some of the fastest viewport and rendering speeds.
      On a side note, and I think this is so important: 4090 owners can't double their VRAM with a second GPU like 3090/Ti owners can, which is worth considering if you're already pushing the limits of 24 GB of video RAM. That alone makes me wonder whether we can expect something next year with NVLink and slightly faster speeds. It's really exciting that we can now take super-fast rendering for granted, and how cheap it has become. Thanks again, Wade, it was an exciting and complete video.

    • @garrygiomarelli3476
      @garrygiomarelli3476 2 years ago +6

      Would be interesting to see the 4090 vs. multiple 3090s combined in the same machine, and some more info on power consumption comparisons when working on big projects.

  • @johntnguyen1976
    @johntnguyen1976 2 years ago +336

    Probably the most useful of the huge influx of 4090 videos happening right now... because all we ever get are stats and catering to gamers. Thanks for putting one out for the creatives!

    • @Oscar4u69
      @Oscar4u69 2 years ago +43

      I got bored of all the reviews talking only about games. That's just a waste of a GPU; games don't need that much power. The real use for a GPU is in things like this.

    • @ishiddddd4783
      @ishiddddd4783 2 years ago +24

      @@Oscar4u69 If they're playing at 4K 100+ fps it's not really a waste, especially since at the moment it's the only GPU that can do so natively with modern titles.

    • @Anti-FreedomD.P.R.ofSouthKorea
      @Anti-FreedomD.P.R.ofSouthKorea 2 years ago +5

      @@ishiddddd4783 But there aren't many great games that suit this performance right now. I would rather play GMod than any of the stuff made for 11- and 12-year-olds that's currently being marketed as "4K max settings".

    • @ishiddddd4783
      @ishiddddd4783 2 years ago

      @@Anti-FreedomD.P.R.ofSouthKorea OK, but that's you, and GMod runs in 4K on almost decade-old hardware.

    • @lavart7043
      @lavart7043 2 years ago

      How is it useful when no Radeon GPUs are included?

  • @JetCooper3D
    @JetCooper3D 2 years ago +65

    We work on Disney and Marvel films at Pinewood Studios. After doing similar tests, we changed our RTX 3090 cards over to 4090s. We stopped buying Quadro cards years ago.
    Great video - subscribed - thank you!

    • @nyahbinghiman5984
      @nyahbinghiman5984 1 year ago +1

      Why did you stop buying Quadros?

    • @ahmetkocoval1375
      @ahmetkocoval1375 1 year ago +2

      @@nyahbinghiman5984 muchhhhhhhh dollars 😂

    • @ichisenzy
      @ichisenzy 1 year ago +8

      tell your boss to make actual good movies

    • @insertname7458
      @insertname7458 1 year ago

      Good job, I don't usually watch films or anything, but I appreciate y'all being able to make all that look realistic even 10-20 years ago.

  • @雪鷹魚英語培訓的領航
    @雪鷹魚英語培訓的領航 2 years ago +309

    Really cool that you got access to these cards like Digital Foundry / Gamers Nexus / et cetera. Those guys aren't focusing on the artist tools like you are, so it's really nice to see that aspect explored here! Definitely interested in Unreal / Houdini / Davinci/Fusion.

    • @n00buo
      @n00buo 2 years ago +5

      Nvidia is desperate to sell this scam card; they'll send cards to anyone for a few bucks so they can lie to people.

    • @RazielXT
      @RazielXT 2 years ago +34

      @@n00buo The 4080 12GB is the scam card; the 4090 is a beast.

    • @n00buo
      @n00buo 2 years ago

      @@RazielXT The 3080 is the best card so far. The 4000 series can't compete with Ampere; they're just heaters for people with money.

    • @user9267
      @user9267 2 years ago +13

      @@n00buo
      4090 seems to be a pretty decent deal

    • @n00buo
      @n00buo 2 years ago +2

      @@user9267 HAHAHA, Nvidia has bots for YouTube comments. They know.

  • @mjlagrone
    @mjlagrone 2 years ago +95

    Yes please for the Part 2. I would especially like to see how it compares in Blender when you have a lot of hair and subsurface scattering! And maybe also with a giant pile of grass and other vegetation.

    • @SnakeTheCowboy
      @SnakeTheCowboy 2 years ago +4

      All hands up for more Blender testing!

  • @KillahMate
    @KillahMate 2 years ago +125

    Note: if you're using Cycles (as opposed to Eevee) it's *all* raytraced. It's a path tracing renderer which means that every sample for every pixel has been raytraced, and has therefore gone through the RTX hardware pipeline - the only difference with reflective surfaces is how coherent the rays are. To test non-raytracing performance you'd need to use Eevee.

    • @PoollShietz
      @PoollShietz 2 years ago +6

      Note: using separate scenes is awesome; you can render Cycles and Eevee together.

    • @Pixel_FX
      @Pixel_FX 2 years ago +2

      Ray tracing and path tracing are two different things.

    • @KillahMate
      @KillahMate 2 years ago +8

      @@Pixel_FX They are two related things - one is a subset of the other. The important bit is that if you have an RTX GPU and Cycles is configured to make use of it via OptiX for hardware acceleration, then the Cycles path samples are being calculated on the GPU's RT cores. And since everything Cycles does is path samples, then _everything_ is rendered with RT cores.
      This is unlike most video games, which must run in real time and therefore only use RT when they have to, like for reflections and such, and never use RT to do path tracing because it's still too demanding and slow for real time.

    • @SirWade
      @SirWade  2 years ago +13

      I misspoke - I was talking about the shaders not being reflection / refraction-heavy in that scene. The scene didn't require much complex calculation compared to something like the Maya render later in the video

    • @BeheadedKamikaze
      @BeheadedKamikaze 2 years ago +14

      @@SirWade Diffuse lighting is *more* complex to calculate than specular reflections. As @KillahMate is trying to explain, this is how path tracing works - a diffuse surface is really just a crap-ton of reflections, all from different directions, and the colour is averaged over hundreds of samples until it becomes smooth. Whereas a specular shader reflects all the rays in more or less the same angle so it turns into a clean result much more quickly. You are getting confused with game rendering terminology. Path tracing is *all* reflections. 100%. It doesn't matter how many specular surfaces there are. And every single one of those rays is calculated using the RT cores.
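The convergence argument in this thread, that a diffuse surface is many incoherent reflections averaged over samples while a mirror needs one coherent ray, can be sketched with a toy Monte Carlo estimator. This is plain Python with a made-up toy light, not Blender or Cycles code:

```python
import random

def shade_diffuse(n_samples, seed=0):
    """Estimate diffuse shading by averaging light arriving from many
    random, incoherent hemisphere directions. The estimate is noisy
    (grainy) until n_samples gets large."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # cosine of a random incoming direction's angle to the normal;
        # the toy "light" contributes exactly this much along that ray
        total += rng.random()
    return total / n_samples

def shade_mirror():
    """A perfect mirror needs one coherent ray: the exact reflection
    direction. One sample, no averaging, instant convergence."""
    return 0.5  # the same toy light evaluated along the single reflected ray

# few samples -> wide spread between runs (noise); many samples -> the
# estimates cluster tightly around the true average of 0.5
noisy = [shade_diffuse(16, seed=s) for s in range(8)]
smooth = [shade_diffuse(4096, seed=s) for s in range(8)]
print(max(noisy) - min(noisy))    # large spread
print(max(smooth) - min(smooth))  # small spread
```

The spread of the 16-sample estimates is far wider than that of the 4096-sample ones, which is exactly why a diffuse surface stays grainy for many samples while a specular highlight cleans up almost immediately.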

  • @fxadam
    @fxadam 2 years ago +72

    Great video. Picked up the RTX 4090 today and it is incredible at hardware rendering in Arnold, Blender, Keyshot etc. Games are fun but this GPU is excellent for content creation.

    • @SW-fh7he
      @SW-fh7he 2 years ago +2

      How did you get it?

    • @fxadam
      @fxadam 2 years ago +2

      @@SW-fh7he Walked into Microcenter on launch day. They had plenty. They're sold out now but they should have more shortly. Apparently Nvidia is sending out links to GeForce Experience users that will allow them to easily order a 4090 from Best Buy without having to deal with the bots that are slamming Best Buy right now.

    • @checkmymovie
      @checkmymovie 2 years ago +1

      I'm going to pick up mine today and build a whole new computer for Daz Studio, because I'm a character designer.

    • @zubairalam2795
      @zubairalam2795 1 year ago

      Also the PSU... :/ and the wattage is huge!!!

    • @Citizen1482608
      @Citizen1482608 2 months ago

      No 4090 will live as long as a Quadro (an RTX A6000, for example) under stressful workloads with daily rendering, the way I use them. That's one of the main reasons I was afraid to buy one.

  • @techdraconis
    @techdraconis 2 years ago +92

    I would love to see a part 2 with Unreal and Houdini.

  • @thevoid6756
    @thevoid6756 2 years ago +8

    The "Why VRAM Matters" chapter is like the hidden gem of this video. Glad Paul recommended your channel.

  • @LiyoungMartin
    @LiyoungMartin 2 years ago +42

    Finally, an in-depth analysis of 4090 performance for 3D workflows!!! Could you pretty pleeeease (as you mentioned earlier in the video) do a separate video for Unreal Engine? Thanks!

    • @flyinggecko3322
      @flyinggecko3322 2 years ago +3

      Yes! Unreal, and an even deeper look into Blender, like different viewport settings and final renders at 4K, would be amazing!

    • @zubairalam2795
      @zubairalam2795 1 year ago

      That's what I'm looking for as well... thanks mate.

  • @gcharb2d
    @gcharb2d 2 years ago +23

    That's why I got the 12 GB RTX 3060 instead of the 8 GB RTX 3070: a tad slower, but cheaper, and it handles larger scenes!
    Great video!

    • @MIchaelSybi
      @MIchaelSybi 1 year ago +3

      I got a GTX 680 with 4 GB instead of 2, and it served me several more years than it would have otherwise, as many programs eventually required 4 GB as a bare minimum.

    • @Musaibavr
      @Musaibavr 2 months ago

      I bought an RTX 3060 12GB instead of an RTX 4060 8GB for 3ds Max.

  • @feloi3033
    @feloi3033 1 year ago +40

    Only this guy can say, "Mom, I need a 4090 for homework."

    • @cryogenicheart2019
      @cryogenicheart2019 7 months ago +2

      Quickest way to get your parents to take you out of animation school and put you in a real university

    • @trungtechreview627
      @trungtechreview627 4 months ago

      @@cryogenicheart2019 After graduating from an animation university, I found more interest in tech and PCs, not animation :)

  • @DonC876
    @DonC876 2 years ago +11

    I think another angle on this is efficiency. If your computer does a lot of compute, that will add to the power bill fast. Just yesterday I saw a review where they tried to find the best efficiency by slightly underclocking (about 150 MHz) and undervolting, and they got the power consumption down almost to 3090 levels (roughly from 400W to 300W) with only a few percent of performance lost. That's a big jump in performance per watt. So there's another cost factor that can make the investment worthwhile even quicker.
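That trade-off is easy to sanity-check with a quick performance-per-watt calculation. The 400 W and 300 W figures come from the comment above; the 5% performance loss is an assumed stand-in for "a few percent":

```python
def perf_per_watt_gain(stock_watts, tuned_watts, perf_loss):
    """Ratio of tuned performance-per-watt to stock performance-per-watt.

    perf_loss is the fraction of performance given up by the undervolt,
    e.g. 0.05 for an assumed 5% loss.
    """
    stock_eff = 1.0 / stock_watts                # stock performance normalized to 1.0
    tuned_eff = (1.0 - perf_loss) / tuned_watts  # slightly slower, much less power
    return tuned_eff / stock_eff

# 400 W stock -> 300 W undervolted, at an assumed 5% performance cost
gain = perf_per_watt_gain(400, 300, 0.05)
print(f"{gain:.2f}x performance per watt")
```

With these assumed numbers the undervolt works out to roughly 1.27x performance per watt: a clear efficiency win, even if short of a literal doubling.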

  • @user-ly1en7kl2o
    @user-ly1en7kl2o 2 years ago +9

    Great to have you back, can't wait to see more of your animations.

  • @3dduff
    @3dduff 2 years ago +5

    Yes, please make a part 2. I am a freelancer who uses Maya/C4D/Houdini and renders mostly with Redshift. I have my own 5-system mini-farm stocked with 30-series GPUs, but a big part of my time is spent on simulations. Only a few situations exist where GPUs can speed up simulations, but I would very much like to see some of these run on a 4090.
    Great video as always, keep up the good work.

  • @pmAdministrator
    @pmAdministrator 1 year ago +2

    You're absolutely right! Thank you for the video. For us, who use these cards for work, these cards are INSANE!

  • @hanselespinosa8918
    @hanselespinosa8918 2 years ago +6

    As an artist working with a 1080 Ti in the current year, I don't even fully grasp the number of creative decisions I could make with this card if I were able to afford it. I agree with the statement regarding the gaming community. The conversation about the 40 series ends the same way it does every year: more frames equals better performance and a better gaming experience. That's it. For the creative community it means time budgets can be allocated differently. When you mentioned the difference of 9 hours: man, 9 hours for sound design or post-processing in general can make a huge difference.
    Really good review. First time I've checked out the channel, thanks for sharing.

  • @rcarter1690
    @rcarter1690 2 years ago +16

    Finally some real-world tests that really show why an animator would spend so much on a card like this. That 3-minute short-film test is the best I've seen, and something no other YouTubers seem to understand. Thank you!

  • @mixtapechi
    @mixtapechi 2 years ago +30

    It's good to see someone making benchmarks on creative programs rather than games. Thanks!

  • @Bunderwahl
    @Bunderwahl 1 year ago +4

    Part 2 please! It would be really cool if you could include the 4080 and 7900 XTX, and even better if you could add video production and other creative applications!

  • @hardwire666too
    @hardwire666too 2 years ago +4

    I am so glad you talked about an actual animation rendering benchmark. I have gotten into countless arguments with people about how a single still frame tells me nothing about how a video card will perform for my needs as an artist. All that tells me is how well that video card renders that one frame with the most optimal settings; it tells me basically nothing. People just DON'T understand that the settings used for one frame might not be great for the next frame. So one frame might render in 30 sec, the next might take 1 min 30 sec, and the frame after that might take 5 min. That single-frame benchmark is utterly useless. So thank you, I feel vindicated. lol.
    Also, on the note of render farms: Blender has a fairly popular one called SheepIt, where you can use your own hardware to earn time on the render farm for your own projects.

    • @carlesv7219
      @carlesv7219 2 years ago

      Imagine trying to tell people that render benchmarks have nothing to do with actual viewport performance while you're working on creating that scene. But people love simplifications and numbers that let them think they know something and made the right choice, even if they only use the card to play Minecraft.

    • @hardwire666too
      @hardwire666too 2 years ago

      @@carlesv7219 For real. I can deal with 10 fps in the viewport; what I need is shorter render times to help me iterate faster. lol
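The single-frame-benchmark complaint above is easy to put in numbers. With hypothetical per-frame render times for a short animation (all figures below are made up for illustration), extrapolating from frame one badly underestimates the real total:

```python
# hypothetical per-frame render times in seconds for an 8-frame shot;
# frames with heavy effects, hair, or motion blur take far longer than frame 1
frame_times = [30, 35, 45, 90, 300, 280, 60, 50]

# what a single-frame benchmark would predict for the whole shot
predicted = frame_times[0] * len(frame_times)
actual = sum(frame_times)

print(f"frame-1 extrapolation: {predicted} s")  # 240 s
print(f"actual animation total: {actual} s")    # 890 s
print(f"underestimate factor: {actual / predicted:.1f}x")
```

With these made-up times, the one-frame estimate is off by a factor of about 3.7, which is why an animation-length benchmark says so much more than a single still.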

  • @St1ngerGuy
    @St1ngerGuy 2 years ago +27

    Very interested in video export from Unreal Engine 5 using the Movie Render Queue in a raytracing-heavy scene. I have a 3090 right now and it does pretty well, but the 4090 looks like it leaps ahead by quite a bit. Thanks for putting this video together.

    • @bikboi3292
      @bikboi3292 2 years ago

      Do you regret buying the 3090?

    • @chillsoft
      @chillsoft 2 years ago +1

      I have two 3090s, so this card is moot for me. I can't pack two of these in the case to get double the VRAM; no motherboard supports two because of how thick they are. I'm going to stay with my NVLinked 2x 3090s and skip this generation unless I'm able to watercool them.

    • @zilverheart
      @zilverheart 2 years ago +1

      @@chillsoft There is a watercooled version of the 4090.

  • @klaus6474
    @klaus6474 2 years ago +2

    I got the RTX S 5100
    CUDA cores: 41,984
    Boost clock: 2.3GHz
    Memory: 128GB GDDR6X
    Memory bus: 3760-bit
    Memory bandwidth: 2036GBps
    RT cores: 264 (3rd-gen)
    Tensor cores: 1042 (3rd-gen)
    NVLink SLI: No
    PCIe: Gen 5
    HDMI: 2.1
    HDCP: 2.3
    Display connectors: 2x HDMI 2.1, 4x DisplayPort 1.4
    Length: 15.3 inches
    Width: 6.0 inches
    Height: 4-slot
    Maximum GPU temp: 102
    Graphics card power: 460W
    Recommended power supply: 1000W
    Power connectors: 5x 8-pin (with supplied 26-pin adapter)

  • @enigmawstudios4130
    @enigmawstudios4130 2 years ago +2

    You're always the go-to for usable info on graphics cards. Everyone else is all about gaming.

  • @GmanGavin1
    @GmanGavin1 2 years ago +1

    Love the video format and the information in it. This is exactly what I will send people whenever I have to explain why VRAM matters.

  • @whidzee
    @whidzee 2 years ago +8

    I'd love to see the performance differences between all the 40XX cards.

  • @TheRealLink
    @TheRealLink 2 years ago +1

    As someone dabbling a lot in Blender and doing some freelance work, your graphs were very helpful, whether brute-forced or RT-native. Great explanations! Subbed.

  • @marcusolivix
    @marcusolivix 2 years ago +1

    Thanks! Finally a review for creators. And yeah, please do a part two!!! It would be great to see how it performs in different render engines and different 3D software.

  • @maxrose8845
    @maxrose8845 2 years ago +2

    Love the focus on creators - not enough of that. You're the man Sir Wade!

  • @pariahgaming365
    @pariahgaming365 2 years ago +1

    I'm an animation student and I have a 3080 Ti. I just finished my first ever shaded render in Maya. It was just 150 frames, but it smoked my M1 Mac mini. Once my workloads start getting super heavy later on I'll definitely upgrade, but I should be good. Also, my CPU is a Ryzen 9 5900X with 32 gigs of DDR4 RAM. I'm sure I'll be upgrading to either 64 or 128 gigs of RAM in the near future.

  • @joonglegamer9898
    @joonglegamer9898 2 years ago +1

    There's a lot more to consider here, especially for the average 3D modelling and animation enthusiast. If you look at your results from the animated Sprite Fright production files, you can clearly see it's not "double the performance"; if anything it's barely 20 percent more with a 4090 than with a 3090. So if you're the average hobbyist, missing 20 percent performance might not be such a huge deal; it's certainly NOT twice the performance. And that brings me to another thing: cost. These cards in Sweden, where I live, cost around 26K SEK, which translates to about 2,321 USD. Most of us who bought the 3090 for around 2,000 USD might not be THAT motivated to junk our cards and pay an extra 2.3K for the difference. In a professional STUDIO setting I totally get the value; even a 5 percent difference can make or break some larger budgets with time constraints. You also have to realize that DLSS isn't used everywhere; it's more an interpolation thing, like those used in older television sets to draw the frames in between two extremes or rendered images.
    So in short: I don't think you will notice much difference when working with Blender Cycles in the viewport while rotating and inspecting the scene. The biggest major upgrade was in fact from the 1080 Ti to the 3090, where you went from choppy, slow movements to relatively real time. From the 3090 to the 4090, the difference is not as HUGE as you make it sound here.
    Also, in Blender, animation files (especially rigged ones) are very CPU-bound, and here it's actually better to have a better CPU.

  • @dominic.h.3363
    @dominic.h.3363 2 years ago +1

    VRAM was the reason I went with a 3060 instead of a 3070: the tens of hours I would waste whenever CPU+RAM had to be used as a fallback wouldn't be worth the few tens of minutes saved by the faster 3070. It was very hard to justify replacing a broken GPU and being voluntarily fleeced with an almost $1k 3060 back at the height of the cryptomining craze, but seeing how fast viewport rendering is with a 4090, I'm tempted to bite the bullet once currency exchange rates stabilize.
    This review was everything I wanted from a creator perspective but never got from the usual outlets. Thanks!

  • @jamesquao1028
    @jamesquao1028 2 years ago +1

    Thank you for making this video. I purchased a 4090 thinking that maybe I had overspent, but as a 3D visualiser you have helped me feel, with a smile, that I made a great investment.

  • @MattHalpain
    @MattHalpain 2 years ago +1

    Great video. Super awesome to see the 4090 from an artist's point of view.

  • @shermanwellons
    @shermanwellons 1 year ago +4

    Part 2 for Cinema 4D and Redshift would be awesome for the 4090. I am thinking about replacing my 3090.

  • @Im_Ninooo
    @Im_Ninooo 2 years ago +3

    3:25 that's one of the reasons why I bought the 12GB model of the 3060, so that I wouldn't have to worry about running out of memory anymore (as I sometimes did with my 4GB 1050 Ti)

    • @macksnotcool
      @macksnotcool 2 years ago +3

      Wow, someone else who went from a 1050 to a 3060, nice

    • @wachocs16
      @wachocs16 2 years ago +1

      I do mostly CAD and 3D scanning and modeling (you don't need that much VRAM very often), but renders take a lot.
      It's a shame I only upgraded from a 1060 6GB to a 3070 8GB. I was really mad that no 3070 12GB model ever existed, and there was a big gap up to the 3080 12GB.

    • @Im_Ninooo
      @Im_Ninooo 2 years ago

      @@macksnotcool I had a 750 Ti for years, then a friend gave me his old 1050 Ti, which I used for a few months before upgrading. Worth every cent.

    • @Im_Ninooo
      @Im_Ninooo 2 years ago

      @@wachocs16 Yeah, that's why the 3060 was so appealing to me: it was reasonably priced and had more VRAM than a 3070 (which was actually quite expensive).

    • @CaptainScorpio24
      @CaptainScorpio24 2 years ago +1

      @@Im_Ninooo I too went from a 1060 6GB to an RTX 3070 8GB.
      I wanted the 3080, but it was power-hungry and expensive too during the mining boom.
      My Cooler Master V650 80 Plus Gold could only handle up to a 3070. 😭

  • @amineamamra2603
    @amineamamra2603 2 years ago +3

    The results are mind-blowing!!
    Would love a part two testing EmberGen maybe, and some Houdini and Karma GPU on it!!

  • @Speed_Monger
    @Speed_Monger 2 years ago +2

    Finally someone showcasing what this card is actually meant for! Thanks

  • @procrastinator24
    @procrastinator24 2 years ago +2

    Please do part 2! I'm looking at this card specifically for Blender and Unreal Engine :D Thanks so much for the content!

  • @theshawnmccown
    @theshawnmccown 2 years ago +1

    These kinds of results are the selling point for me. It's really great value when it performs this well in work and play.

  • @AndyMcMac
    @AndyMcMac 2 years ago

    This is really helpful, thank you! Everyone else just concentrates on games, and that's not what we need.

  • @TGA_anim
    @TGA_anim 2 years ago

    FINALLY, this is what I was looking for, not reviews that talk about video games and stuff. 3D is my thing.

  • @SATYAMKUMAROY
    @SATYAMKUMAROY 2 years ago +1

    Needed this review. Very good content

  • @gerasimosioardanitis5494
    @gerasimosioardanitis5494 2 years ago +6

    Now, with the 4090 abandoning NVLink, I'm seriously considering skipping the 40-series and waiting for the 50-series in the hope they restore it. In the meantime I will seriously get my hands on two 3090s and add them so I can take advantage of 24+24 = 48 GB with NVLink.
    Imo I avoid Quadros since I'm not a studio owner or anything. As an artist I can see a lot of value in the RTX 2080s I had and the 3090s I'm planning on; 48 GB with NVLink is more than enough for my budget to make me happy and load my scenes and projects. I'm not getting worked up over whether it renders in 12h instead of 8h.
    As long as I can improve my workload at a reasonable cost, I am happy. As I mentioned above, I invested in a good motherboard and a nice CPU that I believe serves 80% of the projects a Blender artist needs, so in the future I can add any 30- or 40-series GPUs. But I definitely won't spend 1,600-2,000 euros (European prices) for a 4090 that gives me only 1.7x a 3090 and still gets stuck at 24 GB.
    For the same money I can get 2x 3090s: an Asus here costs 1,200 euros incl. VAT, and excluding VAT that's 867€ x 2 = 1,734€, roughly. Add an NVLink bridge at 125€ and that tops out around 1,860€.
    I hope I don't sound arrogant or biased, but I can't see myself spending enormous amounts on Quadros.
    Farm rendering is still too expensive unless you're running a studio with lots of clients. If you're a solo artist I don't see it as a solution for the time being. Maybe later, when there's more competition in the market, prices will become more reachable, yes.

    • @Carlosmatos-nx4uc
      @Carlosmatos-nx4uc 2 years ago +4

      Finally someone that's not getting fooled by a shiny object. As you said, you can buy two 3090s for the price of one 4090. I myself have a 3090 and am working towards my second one. My biggest disappointment with the 4090 is that I was expecting it to be 32 GB, not 24 GB.

  • @ChaosOver
    @ChaosOver 2 years ago +1

    In a lot of workflows the artist is the limiting factor, not the hardware, even in look dev. The real benefit will be in lighting and final rendering (if you are not working on complex shots that won't fit into your VRAM anyway).
    (Btw, if you are talking about profitability, the power draw is an important factor. So what's the min (idle/desktop), average (working in the viewport), and max (rendering) power consumption? What's the mix (desktop, viewport, render) in real-world scenarios? How much does it draw per day on an example project, compared to other cards? And how much does the room temperature rise (believe me, it does)? How much power does the AC need to cool it? These factors are also important for calculating your costs and figuring out the best solution for you.)

  • @fabianoperes2155
    @fabianoperes2155 2 years ago

    Man, before I watch the video, I just wanted to say you are GORGEOUS! GOSH!!!

  • @MonstroInLA
    @MonstroInLA 6 months ago +1

    The A6000 was surprisingly slower than I expected.

  • @kellyshipman1341
    @kellyshipman1341 2 years ago +1

    Fantastic video! Would definitely like to see some more.

  • @froggy3u
    @froggy3u 1 year ago +3

    At around the 3:10 mark of the video. I am asking just in case:
    Do you keep your viewport in rendered shading preview while rendering the scene?
    I learned this the hard way 5 years ago. I remember I ran out of memory on my 1070 when rendering a heavy scene with Cycles.
    I noticed that I had set the viewport samples and render samples to the same value, 4096.
    I was curious why it was able to use 6 GB of VRAM in viewport shading but not in the render.
    Turns out, when I was rendering the scene, the viewport and the render window were both using VRAM at the same time.
    When I switched my viewport to an image-editor layout, it started rendering each frame in under 15 minutes instead of throwing an "out of memory" error.
    Nowadays, even when using a 3090,
    I always turn on [temporary editors > image editor] in the settings and hit Ctrl+Space in one of the windows for the render (so that the other windows are inactive while I am rendering).
    In my case it improved my renders 3x to 4x once I became aware that viewport shading also uses VRAM even while rendering.
    Frame 1: 18s vs. 1min 09s (with viewport shading in the background) on the same scene settings I used 5 years ago.
    Not sure if your case is the same as mine. Hope this info helps someone.

  • @otegadamagic
    @otegadamagic 2 years ago +3

    Man, thanks for being one of the very few to test for creators. Maybe do another that also shows benchmarks for editing software like DaVinci, Premiere and FCP.
    Cheers from Nigeria

    • @CreatorChaz
      @CreatorChaz 2 years ago +1

      A YouTuber named EposVox has a video that might be what you're looking for. I hope that helps.

    • @otegadamagic
      @otegadamagic 2 years ago +1

      @@CreatorChaz Yeah, thanks, I saw his before Sir Wade posted this one. It would be good to get more people doing these benchmarks so we can compare, I guess.

    • @CreatorChaz
      @CreatorChaz 2 years ago +1

      @@otegadamagic Yeah, it's kinda rough finding non-gaming benchmarks sometimes. I hope more people pop up in the space.

    • @otegadamagic
      @otegadamagic 2 years ago

      @@CreatorChaz Yeah, apparently Nvidia cares more about gamers than content creators. No wonder they mainly sent test units to gamers for review.

  • @mikechristiansen2000
    @mikechristiansen2000 2 years ago +1

    I would be interested in a Houdini 4090 benchmark with Mantra and Karma.

  • @zombiecharger65
    @zombiecharger65 2 years ago

    Glad someone finally addressed VRAM and rendering. Everyone is always talking about gaming. Render time is a big thing, but I need the most VRAM I can get, and Nvidia limits that on most models.

  • @AndrewTanielian
    @AndrewTanielian 2 years ago +1

    This is a great video! You explained all this very well.

  • @IrocZIV
    @IrocZIV 2 years ago +5

    I like that you showed a bit about how it affects more than just rendering. I would like to see something similar showing how CPU and RAM speed impact sculpting performance and other features in Blender.

  • @GabrielGabeRodriguez
    @GabrielGabeRodriguez 2 years ago +4

    Great video. The algorithm surfaced this for me! Just wanted to speculate: when you mentioned the 3090 taking 45 mins per frame, the 3000-series FE cards are notorious for having really bad thermal pads and poor cooler alignment in the early batches (2020). I recently bought a 3090 FE and noticed the memory temperatures at thermal throttle (110°C+). GDDR6 and 6X have error-correcting components; if the memory heats up too much it can start to trip over itself and create errors that slow down its performance. Using quality thermal pads and improving the cooler's seating has increased my thermal headroom on the memory, and it's running at a nice cool 92°C max (my alignment might not have been perfect, as some other people reported an 88°C max temp with the most memory-intensive applications... mining).

  • @MarkSiegemund
    @MarkSiegemund 2 years ago +1

    6:37 Now I wanna play Super Mario 64 so badly! Thanks!

  • @theredredvideo4189
    @theredredvideo4189 1 year ago

    Hands down one of the most comprehensive and useful reviews / deep dives on the 4090. Subbed, liked, and please do a part 2 on C4D!

  • @zaydraco
    @zaydraco 2 years ago

    This is the first serious content-creator review, not just one for YouTubers.

  • @LordLab
    @LordLab 2 years ago +3

    It would be nice to see tests like this on new CPUs, like the AMD AM5 lineup and Intel 13th Gen.

  • @Didjelirium
    @Didjelirium 2 years ago

    I cannot wait to try this card in Blender, but for now the closest I've gotten to a 4090 was downloading a 3D model of it and zooming in on the details. XD

  • @jasonhoi85
    @jasonhoi85 2 years ago

    Thanks! This is the best benchmark for 3D artists.

  • @tonymoore3122
    @tonymoore3122 1 year ago

    Super helpful, as I'm considering purchasing a 4090 or 4080. Thanks so much!

  • @kimmysander2447
    @kimmysander2447 2 years ago +2

    I'd love to see a part 2!

  • @reeeyou
    @reeeyou 2 years ago

    Thank you so much for including the A6000; so many YouTubers don't even include the A6000 in 3D rendering comparisons, given that its VRAM is the same as the 4090's.

  • @louixlinart
    @louixlinart 2 years ago +2

    This is such a helpful video. I'm in the midst of deciding whether I should upgrade from my 3060 to a 3090 or just take the big leap to a 4090. Plus, coming from someone who not only knows PC specs but also does 3D themselves, it's a lot more reliable than watching some random benchmark videos. Thank you for making this video. Cheers!

    • @hman6159
      @hman6159 1 year ago

      What did you end up doing?

    • @louixlinart
      @louixlinart 1 year ago +1

      @@hman6159 Nothing yet 😆 still saving HAHAHA

  • @TheNerd
    @TheNerd 2 years ago +5

    A year ago I switched from a 1080 to a 3080. It blew my mind when I realized I was able to move a fully rigged (human) character of 300,000 faces in the Cycles viewport (with denoising), in the middle of a "Kids Bedroom" scene with A LOT of stuff in it.
    Sure, at a couple of FPS, but a couple of years ago this was simply unthinkable.

  • @gamin546
    @gamin546 2 years ago +1

    Finally, a video benchmarking the RTX 4090 in animation and rendering, hopefully this dude gets more views and subscribers because he honestly deserves it.

  • @nerukas86
    @nerukas86 2 years ago +1

    Great video, that's what I call quality content! Thank you.

  • @bac483
    @bac483 2 years ago

    Just came from a 1080 Ti and got a 3090 Ti, both paid for by the company, so I'm happy for now. The 4090 Ti is right around the corner, not to mention the 5000 series. Good video!

  • @khoifoto
    @khoifoto 1 year ago +1

    The fun part is when my friends and I went into Micro Center and each of us walked out with a 4090, people cursed at us for being scalpers. Little did they know we're just building our own rendering stations :( . This card is a blessing.

  • @MrPablosek
    @MrPablosek 2 years ago +1

    The speeds are insane.
    I only have a GTX 1060 6GB, and rendering is very slow on that thing. Seeing the speed of the 3070 was a huge eye-opener, let alone the 4090!

  • @pgplaysvidya
    @pgplaysvidya 2 years ago +1

    Because of the large number of gamers (probably), I've always had issues with the famous top tech YouTubers not telling me the performance uplift in the software I use. Finally!
    PS: thanks to Paul from Newegg for linking this video :P

  • @rnbpl
    @rnbpl 2 years ago +1

    5:05 When GPUs run out of VRAM, they can sometimes use system RAM (out-of-core rendering), but then they slow down a lot. Maybe that's what happened.
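The out-of-core slowdown described in this comment comes down to bandwidth: once scene data spills past VRAM, the GPU has to fetch it over PCIe instead of its local GDDR6X. A rough back-of-envelope sketch using published figures (1008 GB/s memory bandwidth for the RTX 4090, roughly 31.5 GB/s for a PCIe 4.0 x16 link); the real-world penalty depends on how much of the scene spills and on access patterns:

```python
# Back-of-envelope: how much slower is fetching scene data over PCIe
# (out-of-core) versus reading it from the 4090's local VRAM?
VRAM_BANDWIDTH_GBPS = 1008.0  # RTX 4090 GDDR6X, published spec
PCIE4_X16_GBPS = 31.5         # PCIe 4.0 x16 link to system RAM, approx.

slowdown = VRAM_BANDWIDTH_GBPS / PCIE4_X16_GBPS
print(f"Spilled data is roughly {slowdown:.0f}x slower to fetch")  # → roughly 32x
```

Renderers mitigate this with caching and prefetching, which is why the observed hit is usually a large slowdown rather than a full 32x collapse.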

  • @farazshababi1
    @farazshababi1 2 years ago

    I love the comparison to render farms... I think a lot of people wanna know if they should invest in the hardware themselves! Very helpful! THANKS SIR

  • @flyinggecko3322
    @flyinggecko3322 2 years ago +1

    Great video! Please do take a look at Unreal and an even deeper look into Blender; different types of viewport settings and final render times at 4K would be amazing!

  • @AtoomikDzn
    @AtoomikDzn 2 years ago +3

    Yeeees, I wanted this video so much, thanks!

  • @MikaiGamer1286
    @MikaiGamer1286 2 years ago

    Glad I found you with this; I hardly ever see 3D artists benchmark these cards. I haven't done it in years now, even after graduating with a degree and everything, as there's no work in Florida for it.

  • @andrewferguson2221
    @andrewferguson2221 2 years ago +10

    Ugh. I was really hoping that the card wasn't that good. Nvidia's been getting a little too bold lately, and I was more than ready to write this card off. But I was wondering what made you test at 16k instead of 4k? Either way, I love the video and hope you make a part 2! Thanks Sir Wade!

    • @SirWade
      @SirWade  2 years ago +4

      I mostly just wanted to push the GPUs further to see if the performance scaled - I did a 4K / 16K test 2 years ago and figured I'd just stick with it :P Glad you liked it!

    • @amanda.collaud
      @amanda.collaud 2 years ago

      @@SirWade that favours the 4090 ...

  • @R0NINnoodles
    @R0NINnoodles 2 years ago +1

    Would love to see a part 2, specifically rendering in Unreal Engine 5.

  • @mahadevovnl
    @mahadevovnl 2 years ago +1

    I dunno about RTX and such but my dude, your beard is magnificent. I want to know your tricks. How do you keep it so nicely trimmed? Barber or DIY?

  • @geekydomstudios
    @geekydomstudios 2 years ago

    Please make a part 2! This was an amazing video. 🙂

  • @IndyStry
    @IndyStry 1 year ago +2

    Great video; waiting to see this exact comparison for Unreal Engine 5.1, with a scene full of really heavy 8K textures that push the VRAM. Also, BE AWARE that the 4000-series cards don't support NVLink, in case someone is thinking of getting two (well, you physically won't be able to fit two anyway). This might be why two A6000s or two 3090s could be a better bet for anyone looking to stack another card later for more performance.

    • @KirillPodcast
      @KirillPodcast 1 day ago

      Fact. Two 3090/3090 Ti cards with an NVLink bridge are faster than a single 4090 😁
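Since this thread mentions heavy 8K textures pushing past 24 GB of VRAM, here is a quick sizing sketch. The arithmetic is straightforward (uncompressed sizes; a full mip chain adds roughly one third on top of the base level), but actual GPU memory use varies with texture compression and the renderer:

```python
def texture_mib(width, height, channels=4, bytes_per_channel=1, mipmaps=True):
    """Approximate GPU memory for one uncompressed texture, in MiB."""
    base = width * height * channels * bytes_per_channel
    total = base * 4 / 3 if mipmaps else base  # full mip chain adds ~1/3
    return total / (1024 ** 2)

# One uncompressed 8K RGBA8 texture with mips:
print(f"{texture_mib(8192, 8192):.0f} MiB")  # → 341 MiB
# So roughly 70 such textures would fill a 24 GB card on their own,
# before geometry, BVH, and framebuffers take their share.
```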

  • @SyntaxDomain
    @SyntaxDomain 2 years ago +1

    Great video! I'd love to see a Houdini and Substance Painter video as well if you get a chance.

  • @DrivenKeys
    @DrivenKeys 2 years ago +6

    Thank you for this, Sir. I'd like to see the Part 2. Eventually, if you can get them, I'm interested in how the 4080 cards will do with pro apps. I animate, so fast rendering isn't that important to me, but the Omniverse benefits with Unreal integration are very tempting. I already know I'll have to upgrade my 3060ti soon, it's extremely refreshing to hear an artist's view on what to choose. I like daydreaming of AMD competing here, but it really doesn't seem realistic. What do you think?

    • @CaptainScorpio24
      @CaptainScorpio24 2 years ago

      How's your 3060 Ti working?

    • @DrivenKeys
      @DrivenKeys 2 years ago +1

      @@CaptainScorpio24 So far, I enjoy it. If you're only animating with well designed rigs with Maya's viewport 2.0, it's good enough, but anything beyond that might want more power. Currently, I'm attending Animation Mentor, which uses very efficient rigs. The 3060ti paired with a Ryzen 5600x allows me to view my animation in Maya's viewer 2.0 with lights and textures. Of course, playblast is more accurate, but the viewer plays well enough as you tackle notes. Once I graduate AM, I'll want to explore more projects similar to what Sir is tackling on his channel. As he pointed out in the video, a 3070 couldn't handle rendering some files. That said, you only need gpu-only rendering for faster render times. If you're limited on budget, 3060ti will do for a while, and it games pretty well with ray tracing games at 1080p. GPU prices are going to go down soon, as the 4080 cards release next month and AMD introduces its new generation. So, if you can wait, I recommend holding out for a 3070 or 3080, or a less expensive 3060ti. Also, if I didn't care about ray tracing and Nvidia's Omniverse, I would have tried AMD. For the same price, they have more power and memory. That said, I love ray tracing, so I don't regret the purchase.

    • @rVox-Dei
      @rVox-Dei 2 years ago

      Late to the party, but AMD has been making a lot of updates and it's been getting better; they're also finally adding hardware ray tracing in Blender 3.5.

    • @DrivenKeys
      @DrivenKeys 2 years ago

      @@rVox-Dei Yes, I'm very interested in RDNA 3's improvements. They're getting Ray Tracing sorted, but people say CUDA still has a vast pro advantage. So, the general thought is that, for a 24gb card, a 3090 will still be a better choice than a 7900xtx, but we have yet to see if that's the case. I have a close eye on the 4080 (costs about the same as 3090ti), but I'm concerned about the 16gb vs 24 for rendering large scenes.

  • @amanda.collaud
    @amanda.collaud 2 years ago +2

    He rendered at 16K resolution, which favours the 4090. No one noticed?

  • @Medwedius
    @Medwedius 2 years ago +1

    I want to see a comparison in Unreal. BTW, thanks for the Maya tests. My pipeline is Maya + Unreal.

  • @lazerith840
    @lazerith840 2 years ago

    I'm a 3D artist, and more power is always useful. Faster rendering is the key to managing workload. It would be a no-brainer for me to want this card.

  • @caninac
    @caninac 2 years ago

    It would also be interesting to do a calculation based on the power consumption for an animated render, which would give an indication of how much air conditioning would be needed for a render farm of these guys.
    Also, with these sorts of tests, the OptiX denoiser would be a better choice over Open Image Denoise. Yes, Open Image Denoise is a better denoiser, but it's CPU-bound, not GPU-bound, which skews the results.
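The cooling calculation this comment asks for is simple to sketch: essentially all electrical power drawn ends up as heat the air conditioning must remove, and 1 W of heat is about 3.412 BTU/hr. The 450 W figure below is the 4090's rated board power; the per-node overhead for CPU, RAM, and PSU losses is an assumed placeholder:

```python
def cooling_load_btu_hr(num_gpus, watts_per_gpu=450, overhead_w_per_node=150):
    """Heat load of a single-GPU-per-node render farm, in BTU/hr.
    Nearly all electrical power drawn becomes heat the AC has to remove."""
    total_watts = num_gpus * (watts_per_gpu + overhead_w_per_node)
    return total_watts * 3.412  # 1 W ≈ 3.412 BTU/hr

# Hypothetical 10-node farm of 4090s at rated board power:
print(f"{cooling_load_btu_hr(10):,.0f} BTU/hr")  # → 20,472 BTU/hr
```

For scale, that 10-node figure is in the range of two typical window air conditioners running continuously.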

  • @vialgoenlapantalla
    @vialgoenlapantalla 2 years ago +1

    The beard looks really good on you!

  • @raven_glass
    @raven_glass 2 years ago +2

    Worth factoring in the cost of a new PSU and (probably) case when calculating the cost of a 4090 since most rigs will not be able to handle or fit one of these new cards on an older system that had previously supported a 3000 series.

  • @SimonC021
    @SimonC021 2 years ago

    Thanks for this! Very tempted to get one.

  • @jooaquin
    @jooaquin 2 years ago

    Loved the review! Please do another one with C4D and Octane benchmarks!

  • @dantheman1998
    @dantheman1998 2 years ago +1

    I figured the 4090 was going to sell out at that price for this exact reason: professionals.

  • @crazy4cinnamon
    @crazy4cinnamon 2 years ago

    Just built my very first PC with a 3060 less than half a year ago 😅 I do VFX with Maya, Nuke, Houdini, and Unreal.

  • @TempyEdits
    @TempyEdits 2 years ago +2

    This video was off the charts

  • @shitshow_1
    @shitshow_1 2 years ago

    Honestly, this is some of the best content I have watched. Thanks for posting :D

  • @legendhasit2568
    @legendhasit2568 2 years ago +1

    Make part two already!!!

  • @ivanoleaanimator
    @ivanoleaanimator 2 years ago

    I think there also needs to be a discussion of the power consumption of a machine that uses the 4090. Off the top of my head, I believe it ranges from around 350 W up to 600 W, which is like running a handheld vacuum cleaner while you render, and half that at idle.

  • @soraaoixxthebluesky
    @soraaoixxthebluesky 1 year ago +1

    Getting into the 3D animation scene now, 24 GB of VRAM doesn't seem ridiculous; in fact, it's the opposite: 24 GB looks fairly conservative.
    Now I know why people get mad when GPU manufacturers or vendors raise their prices: some people do all these things for a living, not for fun.

  • @mechaboy95
    @mechaboy95 2 years ago +1

    You can use some of your own PC RAM as dedicated video RAM if your GPU doesn't have enough to render a scene.
    It's in some Windows setting, but I'm not sure how much it helps.