How FAST is the RTX 4090 for 3D Animation + Rendering??

  • Published 5 Oct 2024

COMMENTS • 724

  • @SirWade
    @SirWade  2 years ago +96

    What do you think of the results so far?? Would you get use out of these speeds? And anything you'd like to see me cover next time? :)

    • @NightVisionOfficial
      @NightVisionOfficial 2 years ago +2

      Well... I want to do more complex sculpting in Blender, and to learn UE5 faster, since my GPU is too slow for me to have any patience learning it :/. So... I guess it's a yes. Also, I was looking into Substance Painter, but I have many materials in Blender, so baking textures quicker would be good!

    • @SSPBradley11
      @SSPBradley11 2 years ago +4

      If you complete the test again I'd be interested in seeing how it handles particles for special effects.

    • @rVox-Dei
      @rVox-Dei 2 years ago

      I would love to see a review of how far the HIP backend has come. Can't wait for Radeon 7000, because Nvidia does have a monopoly in Blender right now.

    • @adnanerochdi6982
      @adnanerochdi6982 2 years ago +4

      Of course this is a giant leap in performance gen-to-gen, and it is worth it for someone who needs to double their speeds right away. Thank you so much Wade for the video, by the way. But the value proposition that render farms provide makes one rethink before making the purchase, especially if you have to upgrade the rest of your gear like the PSU and motherboard. That's the main point for me, especially when a 3090 still provides some of the fastest viewport and rendering speeds.
      On a side note, and I think this is so important: 4090 owners can't double their VRAM with a second GPU the way 3090/Ti owners can, which is worth considering if you're already pushing the limits of 24GB of video RAM. That alone makes me wonder whether we can expect something next year with NVLink and slightly faster speeds. It's really exciting that we can now take super-fast rendering for granted, and how cheap it has become. Thanks again Wade, it was an exciting and complete video.

    • @garrygiomarelli3476
      @garrygiomarelli3476 2 years ago +6

      Would be interesting to see the 4090 vs multiple 3090s combined in the same machine, and some more info on power consumption comparisons when working on big projects.

  • @johntnguyen1976
    @johntnguyen1976 2 years ago +326

    Probably the most useful of the huge influx of 4090 videos happening right now...cuz all we ever get are stats, and catering to the gamers. Thanks for putting one out for the creatives!

    • @Oscar4u69
      @Oscar4u69 2 years ago +42

      I got bored of all the reviews talking only about games; that's just a waste of a GPU. Games don't need that much power. The real use for a GPU is in things like this.

    • @ishiddddd4783
      @ishiddddd4783 2 years ago +23

      @@Oscar4u69 If they're playing at 4K 100+ fps it's not really a waste, especially since at the moment it's the only GPU that can do so natively in modern titles.

    • @Anti-FreedomD.P.R.ofSouthKorea
      @Anti-FreedomD.P.R.ofSouthKorea 2 years ago +5

      @@ishiddddd4783 But there aren't many great games to play right now that suit this performance. I would rather play GMod than any of the stuff made for 11-12 year olds that's currently being marketed as "4K max settings".

    • @ishiddddd4783
      @ishiddddd4783 2 years ago

      @@Anti-FreedomD.P.R.ofSouthKorea OK, but that's you, and GMod runs at 4K on almost decade-old hardware.

    • @lavart7043
      @lavart7043 1 year ago

      How is it useful when no Radeon GPUs are included?

  • @雪鷹魚英語培訓的領航
    @雪鷹魚英語培訓的領航 2 years ago +309

    Really cool that you got access to these cards like Digital Foundry / Gamers Nexus / et cetera. Those guys aren't focusing on the artist tools like you are, so it's really nice to see that aspect explored here! Definitely interested in Unreal / Houdini / Davinci/Fusion.

    • @n00buo
      @n00buo 2 years ago +5

      Nvidia is desperate to sell this scam card; they'll send cards to anyone for a few bucks so they can lie to people.

    • @RazielXT
      @RazielXT 2 years ago +33

      @@n00buo The 4080 12GB is the scam card; the 4090 is a beast.

    • @n00buo
      @n00buo 2 years ago

      @@RazielXT The 3080 is the best card so far. The 4000 series can't compete with Ampere; they're just heaters for people with money to burn.

    • @user9267
      @user9267 2 years ago +13

      @@n00buo The 4090 seems to be a pretty decent deal.

    • @n00buo
      @n00buo 2 years ago +2

      @@user9267 HAHAHA, Nvidia has bots for YouTube comments, they know.

  • @JetCooper3D
    @JetCooper3D 1 year ago +64

    We work on Disney and Marvel films at Pinewood Studios. After doing similar tests, we switched our RTX 3090 cards over to 4090s. We stopped buying Quadro cards years ago.
    Great video - subscribed - thank you!

    • @nyahbinghiman5984
      @nyahbinghiman5984 1 year ago +1

      Why did you stop buying Quadros?

    • @ahmetkocoval1375
      @ahmetkocoval1375 1 year ago +2

      @@nyahbinghiman5984 muchhhhhhhh dollars 😂

    • @ichisenzy
      @ichisenzy 1 year ago +6

      Tell your boss to make actually good movies.

    • @insertname7458
      @insertname7458 1 year ago

      Good job, g. I don't usually watch films or anything, but I appreciate y'all being able to make all that look realistic even like 10-20 years ago.

  • @mjlagrone
    @mjlagrone 2 years ago +94

    Yes please for the Part 2. I would especially like to see how it compares in Blender when you have a lot of hair and subsurface scattering! And maybe also with a giant pile of grass and other vegetation.

    • @SnakeTheCowboy
      @SnakeTheCowboy 2 years ago +4

      All hands up for more Blender testing!

  • @feloi3033
    @feloi3033 1 year ago +33

    Only this guy can say, "Mom, I need a 4090 for homework".

    • @cryogenicheart2019
      @cryogenicheart2019 6 months ago +1

      Quickest way to get your parents to take you out of animation school and put you in a real university

    • @trungtechreview627
      @trungtechreview627 3 months ago

      @@cryogenicheart2019 After graduating from an animation university, I found more interest in tech and PCs, not animation :)

  • @KillahMate
    @KillahMate 2 years ago +124

    Note: if you're using Cycles (as opposed to Eevee) it's *all* raytraced. It's a path tracing renderer which means that every sample for every pixel has been raytraced, and has therefore gone through the RTX hardware pipeline - the only difference with reflective surfaces is how coherent the rays are. To test non-raytracing performance you'd need to use Eevee.

    • @Deezsirrer
      @Deezsirrer 2 years ago +6

      Note: the scene/layer system is awesome; you can render Cycles and Eevee together.

    • @Pixel_FX
      @Pixel_FX 2 years ago +2

      Ray tracing and path tracing are two different things.

    • @KillahMate
      @KillahMate 2 years ago +8

      @@Pixel_FX They are two related things - one is a subset of the other. The important bit is that if you have an RTX GPU and Cycles is configured to make use of it via OptiX for hardware acceleration, then the Cycles path samples are being calculated on the GPU's RT cores. And since everything Cycles does is path samples, then _everything_ is rendered with RT cores.
      This is unlike most video games, which must run in real time and therefore only use RT when they have to, like for reflections and such, and never use RT to do path tracing because it's still too demanding and slow for real time.

    • @SirWade
      @SirWade  1 year ago +13

      I misspoke - I was talking about the shaders not being reflection/refraction-heavy in that scene. The scene didn't require much complex calculation compared to something like the Maya render later in the video.

    • @BeheadedKamikaze
      @BeheadedKamikaze 1 year ago +14

      @@SirWade Diffuse lighting is *more* complex to calculate than specular reflections. As @KillahMate is trying to explain, this is how path tracing works - a diffuse surface is really just a crap-ton of reflections, all from different directions, and the colour is averaged over hundreds of samples until it becomes smooth. Whereas a specular shader reflects all the rays in more or less the same angle so it turns into a clean result much more quickly. You are getting confused with game rendering terminology. Path tracing is *all* reflections. 100%. It doesn't matter how many specular surfaces there are. And every single one of those rays is calculated using the RT cores.
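The convergence argument in this thread can be illustrated outside Blender entirely. Below is a toy Monte Carlo sketch (plain Python, not Cycles code; the environment function and mirror direction are invented for illustration) showing why a diffuse estimate stays noisy until many samples are averaged, while a perfect mirror returns a clean value immediately:

```python
import random

def sky_light(z):
    """Hypothetical environment light: brighter toward the zenith."""
    return max(z, 0.0)

def diffuse_estimate(n_samples, rng):
    # A diffuse surface gathers light from random hemisphere directions,
    # so the estimate is noisy; its variance shrinks roughly as 1/n_samples.
    total = 0.0
    for _ in range(n_samples):
        z = rng.random()  # random upward component in [0, 1)
        total += sky_light(z)
    return total / n_samples

def specular_estimate():
    # A perfect mirror reflects every camera ray into one fixed direction,
    # so every sample returns the identical value: zero noise.
    reflected_z = 0.7  # assumed mirror direction
    return sky_light(reflected_z)

rng = random.Random(0)
print(diffuse_estimate(16, rng))       # noisy at 16 samples
print(diffuse_estimate(200_000, rng))  # converges near the true mean, 0.5
print(specular_estimate())             # 0.7, clean from the first sample
```

The same 1/sqrt(n) behaviour is why a shot full of rough, diffuse materials can need more samples (and more render time) than a shiny one, even though both are 100% raytraced.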

  • @fxadam
    @fxadam 2 years ago +70

    Great video. Picked up the RTX 4090 today and it is incredible at hardware rendering in Arnold, Blender, Keyshot etc. Games are fun but this GPU is excellent for content creation.

    • @SW-fh7he
      @SW-fh7he 2 years ago +2

      How did you get it?

    • @fxadam
      @fxadam 2 years ago +2

      @@SW-fh7he Walked into Micro Center on launch day. They had plenty. They're sold out now but they should have more shortly. Apparently Nvidia is sending out links to GeForce Experience users that will let them easily order a 4090 from Best Buy without having to deal with the bots that are slamming Best Buy right now.

    • @checkmymovie
      @checkmymovie 1 year ago +1

      I'm gonna pick up mine today and build a whole new computer for Daz Studio, because I'm a character designer.

    • @zubairalam2795
      @zubairalam2795 1 year ago

      Also the PSU... :/ and the wattage is huge!!!

    • @Citizen1482608
      @Citizen1482608 27 days ago

      No 4090 will live as long as a Quadro (an RTX A6000, for example) under the kind of stressful daily rendering workloads I put them through. That's one of the main reasons I was afraid to buy one.

  • @thevoid6756
    @thevoid6756 1 year ago +8

    The "Why Vram Matters" chapter is like the hidden gem of this video. Glad Paul recommended your channel.

  • @techdraconis
    @techdraconis 2 years ago +92

    I would love to see a part 2 with Unreal and Houdini.

  • @LiyoungMartin
    @LiyoungMartin 2 years ago +42

    Finally, an in-depth analysis of 4090 performance for 3d workflow!!! Could you pretty pleeeease (as you mentioned earlier in the video) do a separate video for unreal engine? Thanks!

    • @flyinggecko3322
      @flyinggecko3322 2 years ago +3

      Yes! Unreal, and an even further look into Blender: different types of viewport settings and final renders at 4K would be amazing!

    • @zubairalam2795
      @zubairalam2795 1 year ago

      That's what I'm looking for as well... thanks mate

  • @user-ly1en7kl2o
    @user-ly1en7kl2o 2 years ago +9

    Great you're back, can't wait to see more of your animations.

  • @DonC876
    @DonC876 2 years ago +10

    I think another angle to look at this from is efficiency. If you have your computer compute a lot, that adds to the power bill fast. Just yesterday I saw a review where they tried to find the best efficiency by slightly underclocking (about 150 MHz) and undervolting, and they got the power consumption down almost to 3090 levels (roughly from 400 W to 300 W) with only a few percent of performance lost. That's a big jump in performance per watt. So there's another cost factor that can make the investment worthwhile even quicker.
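As a quick sanity check, the figures quoted in the comment above (illustrative numbers from that review, not measurements) work out to roughly a 1.27x gain in performance per watt:

```python
# Perf-per-watt from the quoted undervolting figures (assumed values).
stock_watts, tuned_watts = 400.0, 300.0  # reported power draw
stock_perf, tuned_perf = 1.00, 0.95      # ~5% performance lost when tuned

stock_eff = stock_perf / stock_watts     # relative work per watt
tuned_eff = tuned_perf / tuned_watts
gain = tuned_eff / stock_eff
print(f"efficiency gain: {gain:.2f}x")   # prints "efficiency gain: 1.27x"
```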

  • @gcharb2d
    @gcharb2d 1 year ago +21

    That's why I got the 12 GB RTX 3060 instead of the 8 GB RTX 3070, a tad slower, but cheaper, and it handles larger scenes!
    Great video!

    • @MIchaelSybi
      @MIchaelSybi 1 year ago +3

      I got a GTX 680 with 4 GB instead of 2, and it served me several more years than it would have otherwise, as many programs eventually listed 4GB as a bare minimum.

    • @Musaibavr
      @Musaibavr 1 month ago

      I bought an RTX 3060 12GB instead of an RTX 4060 8GB for 3ds Max.

  • @St1ngerGuy
    @St1ngerGuy 2 years ago +26

    Very interested in video export from Unreal Engine 5 using the Movie Render Queue in a raytracing-heavy scene. I have a 3090 right now and it does pretty well, but the 4090 looks like it leaps ahead by quite a bit. Thanks for putting this video together.

    • @bikboi3292
      @bikboi3292 2 years ago

      Do you regret buying the 3090?

    • @chillsoft
      @chillsoft 2 years ago +1

      I have two 3090s, so this card is moot for me. Can't pack two of these in the case to double up; no motherboard supports two because of how thick they are. Gonna stay with my NVLinked 2x3090s and skip this generation unless I'm able to watercool them.

    • @zilverheart
      @zilverheart 2 years ago +1

      @@chillsoft There exists a watercooled version of the 4090.

  • @3dduff
    @3dduff 2 years ago +5

    Yes, please make a part 2. I am a freelancer who uses Maya/C4D/Houdini and renders mostly with Redshift. I have my own 5-system mini farm stocked with 30xx GPUs. But a big part of my time is spent on simulations. Only a few situations exist where GPUs can speed up simulations, but I would very much like to see some of those run on a 4090.
    Great video as always, keep up the good work.

  • @rcarter1690
    @rcarter1690 1 year ago +16

    Finally some real-world tests that really show why an animator would spend so much on a card like this. That 3-minute short test is the best I've seen; no other YouTubers seem to understand that. Thank you!

  • @Bunderwahl
    @Bunderwahl 1 year ago +4

    Part 2 please, would be really cool if you could include 4080 and 7900 XTX, and even better if you could add video production and other creative applications!

  • @whidzee
    @whidzee 1 year ago +8

    I'd love to see the performance differences between all the 40XX cards.

  • @hanselespinosa8918
    @hanselespinosa8918 2 years ago +6

    As an artist working with a 1080 Ti in the current year, I don't even fully grasp the number of creative decisions I could make with this card if I were able to afford it. I agree with the statement about the gaming community: the conversation about the 40 series ends the same way it does every year — sorry, the same way every year: more frames equals better performance, better gaming experience, that's it. For the creative community it means time budgets can be allocated differently. When you mentioned the difference of 9 hours... man, 9 hours for sound design or post-processing in general can make a huge difference.
    Really good review. First time checking out the channel, thanks for sharing.

  • @marcusolivix
    @marcusolivix 2 years ago +1

    Thanks! Finally a review for creators. ...and yeah! Please do a part two!!! It would be great to see how it performs in different render engines and different 3D software.

  • @ChaosOver
    @ChaosOver 1 year ago +1

    In a lot of workflows the artist is the limiting factor, not the hardware, even in look-dev. The real benefit will be in lighting and final rendering (if you are not working on complex shots that won't fit into your VRAM anyway).
    (Btw, if you are talking about profitability, power draw is an important factor. What's the min (idle/desktop), average (working in the viewport), and max (rendering) power consumption? What's the mix (desktop, viewport, render) in real-world scenarios? How much does it draw on an example project per day, compared to other cards? And how much does the room temperature rise (believe me, it does)? How much power does the AC need to cool it? These factors are also important for calculating your costs and figuring out the best solution for you.)
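Those power questions can be turned into a back-of-envelope daily-cost model. Every number below (state hours, wattages, electricity price) is a placeholder assumption, not a measurement of any card:

```python
# Hypothetical workday mix for one workstation GPU (all values assumed).
hours = {"idle": 4, "viewport": 8, "render": 4}       # hours/day in each state
watts = {"idle": 60, "viewport": 250, "render": 450}  # assumed draw per state
price_per_kwh = 0.40                                  # assumed price, EUR/kWh

kwh_per_day = sum(hours[s] * watts[s] / 1000 for s in hours)
cost_per_day = kwh_per_day * price_per_kwh
print(f"{kwh_per_day:.2f} kWh/day -> {cost_per_day:.2f} EUR/day")
```

Swapping in measured wattages per state (and the local tariff) turns this into the per-card comparison the comment asks for.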

  • @hardwire666too
    @hardwire666too 2 years ago +4

    I am so glad you talked about an actual animation rendering benchmark. I have gotten into countless arguments with people about how a single still frame tells me nothing about how a video card will perform for my needs as an artist. All that tells me is how well that card renders that one frame with the most optimal settings. People just DON'T understand that the settings used for one frame might not be great for the next: one frame might render in 30 sec, the next might take 1 min 30 sec, and the frame after that might take 5 min. So a single-frame benchmark is practically useless for animation. So thank you. I feel vindicated. lol.
    Also, on the note of render farms: Blender has a fairly popular one called SheepIt where you can use your own hardware to earn render-farm time for your own projects.

    • @carlesv7219
      @carlesv7219 1 year ago

      Imagine trying to tell people that rendering scenes has nothing to do with viewport performance while you're actually working on creating that scene. But people love simplifications and numbers; it lets them think they know something and made the right choice, even if they only use the card to play Minecraft.

    • @hardwire666too
      @hardwire666too 1 year ago

      @@carlesv7219 For real. It's like: I can deal with 10fps in the viewport; what I need is shorter render times to help me iterate faster. lol

  • @pariahgaming365
    @pariahgaming365 1 year ago +1

    I'm an animation student and I have a 3080 Ti. I just finished my first ever shaded render in Maya. It was just 150 frames, but it smoked my M1 Mac mini. Once my workloads start getting super heavy later on I'll definitely upgrade, but I should be good for now. Also, my CPU is a Ryzen 9 5900X with 32 gigs of DDR4 RAM. I'm sure I'll be upgrading to either 64 or 128 gigs of RAM in the near future.

  • @shermanwellons
    @shermanwellons 1 year ago +4

    Part 2 for Cinema 4D and Redshift would be awesome for the 4090. I am thinking about replacing my 3090.

  • @amineamamra2603
    @amineamamra2603 2 years ago +3

    Results are mind-blowing!!
    Would love a part two testing maybe EmberGen, and some Houdini and Karma GPU on it!!

  • @enigmawstudios4130
    @enigmawstudios4130 2 years ago +2

    You're always the go-to for usable info on graphics cards. Everyone else is gaming.

  • @owenjenkinsofficial
    @owenjenkinsofficial 2 years ago +2

    DO a part 2!

  • @andrewferguson2221
    @andrewferguson2221 2 years ago +10

    Ugh. I was really hoping that the card wasn't that good. Nvidia's been getting a little too bold lately, and I was more than ready to write this card off. But I was wondering what made you test at 16k instead of 4k? Either way, I love the video and hope you make a part 2! Thanks Sir Wade!

    • @SirWade
      @SirWade  2 years ago +4

      I mostly just wanted to push the GPUs further to see if the performance scaled - I did a 4K / 16K test 2 years ago and figured I'd just stick with it :P Glad you liked it!

    • @amanda.collaud
      @amanda.collaud 1 year ago

      @@SirWade that favours the 4090 ...

  • @pmAdministrator
    @pmAdministrator 1 year ago +1

    You're absolutely right! Thank you for the video. For us, who use these cards for work, these cards are INSANE!

  • @mikechristiansen2000
    @mikechristiansen2000 1 year ago +1

    I would be interested in a Houdini 4090 benchmark with Mantra and Karma.

  • @GmanGavin1
    @GmanGavin1 2 years ago +1

    Love the video format and the information in it. This is exactly what I will send people whenever I have to explain why VRAM matters.

  • @TheRealLink
    @TheRealLink 2 years ago +1

    As someone dabbling a lot with Blender and doing some freelance work, your graphs were very helpful, whether brute-forced or RT native. Great explanations! Subbed.

  • @mixtapechi
    @mixtapechi 2 years ago +30

    It's good to see someone making benchmarks on creative programs rather than games. Thanks!

  • @gerasimosioardanitis5494
    @gerasimosioardanitis5494 2 years ago +6

    Now that the 4090 has abandoned NVLink, I'm seriously considering skipping the 40 series and waiting for the 50 series in the hope they restore it. In the meantime I'll seriously look at getting 2 3090s and adding them, so I can take advantage of the 24+24=48GB with NVLink.
    Imo I avoid Quadros since I'm not a studio owner or anything. As an artist I saw a lot of value in the RTX 2080s I had; now the 3090s I'm planning on, plus the 48GB over NVLink, are more than enough for my budget to make me happy and load my scenes and/or projects. I'm not getting worked up over whether it renders in 12h instead of 8h.
    As long as I can improve my workload at a reasonable cost, I'm happy. As I mentioned above, I invested in a good motherboard and a nice CPU that I believe serves 80% of the projects a Blender artist needs. So in the future I can add any 30- or 40-series GPUs, but I definitely won't spend 1600-2000 (European prices) on a 4090 that gives me 1.7x a 3090 and still leaves me stuck at 24GB.
    For the same money I can get 2 x 3090s: an Asus here costs 1200 euros incl. VAT, and excluding VAT that's 867€ x 2 = 1734. Let's add an NVLink bridge at 125€, which tops it out at 1859€, call it roughly 1900€.
    I hope I don't sound arrogant or biased, but I can't see myself spending enormous amounts on Quadros.
    Farm rendering is still too expensive unless you're running a studio with lots of clients. If you're a solo artist I don't see it as a solution for the time being. Maybe later, when there's more competition in the market, prices will become more reachable, yes.

    • @Carlosmatos-nx4uc
      @Carlosmatos-nx4uc 2 years ago +4

      Finally, someone who's not getting fooled by a shiny object. As you said, you can buy two 3090s for the price of one 4090. I have a 3090 myself and am working towards my second one. My biggest disappointment with the 4090 is that I was expecting it to be 32GB, not 24GB.
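The price arithmetic in this thread, using the commenter's quoted figures (VAT-exclusive EU prices at the time, not current ones), works out as:

```python
# Figures quoted in the comments above; prices in EUR, VAT excluded.
price_3090_ex_vat = 867  # per card, as quoted
nvlink_bridge = 125      # as quoted

two_3090s_total = 2 * price_3090_ex_vat + nvlink_bridge
combined_vram = 2 * 24   # GB pooled over NVLink, as described

print(two_3090s_total)   # 1859, i.e. "roughly 1900" vs a quoted 1600-2000 4090
print(combined_vram)     # 48
```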

  • @joonglegamer9898
    @joonglegamer9898 1 year ago +1

    There's a lot more to consider here, especially for the average 3D modelling/animation enthusiast. If you look at your results from the animated Sprite Fright production files, you can clearly see it's not "double the performance"; if anything it's barely 20 percent more with a 4090 than with a 3090. So if you're the average hobbyist it might not be such a huge deal to miss out on 20 percent performance; it's certainly NOT twice the performance. And that brings me to another thing: cost. These cards in Sweden, where I live, cost around 26K SEK, which translates to 2321 USD. Those of us who bought the 3090 for around 2000 USD might not be THAT motivated to junk our cards and pay an extra 2.3K to get the difference. In a professional STUDIO setting I totally get the value; even a 5 percent difference can make or break some larger budgets with time constraints. You also have to realize that DLSS isn't used everywhere; it's more an interpolation thing, like those used in older television sets to draw the frames in between two extremes or rendered images.
    So in short, I don't think you will notice much difference when working with Blender Cycles in the viewport when rotating and inspecting the scene. The biggest major upgrade was in fact from the 1080 Ti to the 3090, where you went from choppy, slow movement to relatively real-time. From 3090 to 4090, the difference is not as HUGE as you make it sound here.
    Also, in Blender, animation files (especially rigged ones) are very CPU-bound, and here it's actually better to have a better CPU.

  • @maxrose8845
    @maxrose8845 1 year ago +2

    Love the focus on creators - not enough of that. You're the man Sir Wade!

  • @procrastinator24
    @procrastinator24 2 years ago +2

    Please do a part 2! I'm looking at this card specifically for Blender and Unreal Engine :D Thanks so much for the content!

  • @kellyshipman1341
    @kellyshipman1341 2 years ago +1

    Fantastic video! Would definitely like to see some more.

  • @Im_Ninooo
    @Im_Ninooo 2 years ago +3

    3:25 that's one of the reasons why I bought the 12GB model of the 3060, so that I wouldn't have to worry about running out of memory anymore (as I sometimes did with my 4GB 1050 Ti)

    • @macksnotcool
      @macksnotcool 2 years ago +3

      Wow, someone else who went from a 1050 to a 3060, nice

    • @wachocs16
      @wachocs16 2 years ago +1

      I do mostly CAD and 3D scanning and modeling (you don't need that much VRAM very often), but renders take a lot.
      It's a shame I only upgraded from a 1060 6GB to a 3070 8GB. I was really mad that no 3070 12GB model existed, and there was a big gap up to the 3080 12GB.

    • @Im_Ninooo
      @Im_Ninooo 2 years ago

      @@macksnotcool I had a 750 Ti for years, then a friend gave me his old 1050 Ti, which I used for a few months before upgrading. Worth every cent.

    • @Im_Ninooo
      @Im_Ninooo 2 years ago

      @@wachocs16 Yeah, that's why the 3060 was so appealing to me; it was reasonably priced and had more VRAM than a 3070 (which was actually quite expensive).

    • @CaptainScorpio24
      @CaptainScorpio24 2 years ago +1

      @@Im_Ninooo I too went from a 1060 6GB to an RTX 3070 8GB.
      Wanted a 3080, but it was power-hungry and expensive too in the mining boom.
      My Cooler Master V650 80 Plus Gold could handle up to a 3070 only. 😭

  • @DrivenKeys
    @DrivenKeys 2 years ago +6

    Thank you for this, Sir. I'd like to see the Part 2. Eventually, if you can get them, I'm interested in how the 4080 cards will do with pro apps. I animate, so fast rendering isn't that important to me, but the Omniverse benefits with Unreal integration are very tempting. I already know I'll have to upgrade my 3060ti soon, it's extremely refreshing to hear an artist's view on what to choose. I like daydreaming of AMD competing here, but it really doesn't seem realistic. What do you think?

    • @CaptainScorpio24
      @CaptainScorpio24 2 years ago

      How's your 3060 Ti working?

    • @DrivenKeys
      @DrivenKeys 2 years ago +1

      @@CaptainScorpio24 So far, I enjoy it. If you're only animating with well designed rigs with Maya's viewport 2.0, it's good enough, but anything beyond that might want more power. Currently, I'm attending Animation Mentor, which uses very efficient rigs. The 3060ti paired with a Ryzen 5600x allows me to view my animation in Maya's viewer 2.0 with lights and textures. Of course, playblast is more accurate, but the viewer plays well enough as you tackle notes. Once I graduate AM, I'll want to explore more projects similar to what Sir is tackling on his channel. As he pointed out in the video, a 3070 couldn't handle rendering some files. That said, you only need gpu-only rendering for faster render times. If you're limited on budget, 3060ti will do for a while, and it games pretty well with ray tracing games at 1080p. GPU prices are going to go down soon, as the 4080 cards release next month and AMD introduces its new generation. So, if you can wait, I recommend holding out for a 3070 or 3080, or a less expensive 3060ti. Also, if I didn't care about ray tracing and Nvidia's Omniverse, I would have tried AMD. For the same price, they have more power and memory. That said, I love ray tracing, so I don't regret the purchase.

    • @rVox-Dei
      @rVox-Dei 1 year ago

      Late to the party, but AMD has been making a lot of updates and it's been getting better; they're also finally adding hardware ray tracing in Blender 3.5.

    • @DrivenKeys
      @DrivenKeys 1 year ago

      @@rVox-Dei Yes, I'm very interested in RDNA 3's improvements. They're getting Ray Tracing sorted, but people say CUDA still has a vast pro advantage. So, the general thought is that, for a 24gb card, a 3090 will still be a better choice than a 7900xtx, but we have yet to see if that's the case. I have a close eye on the 4080 (costs about the same as 3090ti), but I'm concerned about the 16gb vs 24 for rendering large scenes.

  • @kimmysander2447
    @kimmysander2447 2 years ago +2

    I'd love to see a part 2!

  • @dominic.h.3363
    @dominic.h.3363 1 year ago +1

    VRAM was the reason I went with a 3060 instead of a 3070: whenever I'd need to fall back to CPU+RAM I would waste tens of hours, which wouldn't be worth the few tens of minutes the faster 3070 saves. It was very hard to justify replacing a broken GPU while being voluntarily fleeced with an almost $1k 3060 at the height of the cryptomining craze, but seeing how fast viewport rendering is with a 4090, I'm tempted to bite the bullet once currency exchange rates stabilize.
    This review was everything I wanted from a creator perspective but never got from the usual outlets. Thanks!

  • @otegadamagic
    @otegadamagic 2 years ago +3

    Man, thanks for being one of the very few to test for creators. Maybe do another that also shows benchmarks for editing software like DaVinci, Premiere, and FCP.
    Cheers from Nigeria

    • @CreatorChaz
      @CreatorChaz 2 years ago +1

      A YouTuber named EposVox has a video that might be what you're looking for. I hope that helps.

    • @otegadamagic
      @otegadamagic 2 years ago +1

      @@CreatorChaz Yeah thanks, I saw his before Sir Wade posted this one. It would be good to get more people doing these benchmarks so we can compare, I guess.

    • @CreatorChaz
      @CreatorChaz 2 years ago +1

      @@otegadamagic Yeah, It's kinda rough finding non-gaming benchmarks sometimes. I hope more people pop up in the space.

    • @otegadamagic
      @otegadamagic 2 years ago

      @@CreatorChaz Yeah, apparently Nvidia cares more about gamers than content creators. No wonder they mainly sent test units to gamers for review.

  • @SATYAMKUMAROY
    @SATYAMKUMAROY 2 years ago +1

    Needed this review. Very good content

  • @flyinggecko3322
    @flyinggecko3322 2 years ago +1

    Great video! Please do take a look at Unreal and an even further look into blender, like different types of viewport settings and final render times at 4K would be amazing!

  • @Speed_Monger
    @Speed_Monger 2 years ago +2

    Finally someone showcasing what this card is actually meant for! Thanks

  • @TheNerd
    @TheNerd 2 years ago +5

    A year ago I switched from a 1080 to a 3080. It blew my mind when I realized I'm able to move a fully rigged (human) character of 300,000 faces in the Cycles viewport (with denoising) in the middle of a "kids' bedroom" scene with A LOT of stuff in it.
    Sure, at a couple of FPS, but a few years ago this was simply unthinkable and fully impossible.

  • @LordLab
    @LordLab 2 years ago +3

    It would be nice to see tests like this on new CPUs, like the AMD AM5 lineup and Intel 13th Gen.

  • @theshawnmccown
    @theshawnmccown 1 year ago +1

    These kinds of results are the selling point for me. It's really great value when it performs this well in both work and play.

  • @mechaboy95
    @mechaboy95 2 years ago +1

    You can use some of your own PC RAM as dedicated video RAM if your GPU doesn't have enough to render a scene.
    It's in some Windows setting, but I'm not sure how much it helps.

  • @AndrewTanielian
    @AndrewTanielian 1 year ago +1

    This is a great video! You explained all this very well.

  • @MarkSiegemund
    @MarkSiegemund 1 year ago +1

    6:37 Now I wanna play Super Mario 64 so badly! Thanks!

  • @TheStrandedAlliance
    @TheStrandedAlliance 1 year ago +1

    I wonder if at some point Cycles becomes faster than Eevee in VR rendering. Unfortunately, Eevee is still poorly optimized, and hasn't received many updates since its debut.

  • @theredredvideo4189
    @theredredvideo4189 1 year ago

    Hands down one of the most comprehensive and useful reviews / deep dives on the 4090. Subbed, liked, and please do a part 2 on C4D!

  • @IrocZIV
    @IrocZIV 2 years ago +5

    I like that you showed a bit about how it affects more than just rendering. I would like to see something similar showing how CPUs and RAM speed could impact sculpting performance and other features, in Blender.

  • @rnbpl
    @rnbpl 2 years ago +1

    5:05 When GPUs run out of VRAM, they can sometimes use system RAM (out-of-core rendering), but then things slow down a lot. Maybe that's what happened.

  • @MattHalpain
    @MattHalpain 1 year ago +1

    Great video. Super awesome to see the 4090 from an artist point of view.

  • @dkphantomdk
    @dkphantomdk 1 year ago +1

    Would like to see the same with the new AMD Radeon RX 7900 XTX / RX 6950 XT / Pro W6800, just to compare. I know they might not be as good as the 4090, since the 7900/6950 are gaming GPUs, but the Pro W6800 might be interesting!

  • @froggy3u
    @froggy3u 1 year ago +3

    At around the 3:10 mark of the video. I am asking just in case.
    Do you leave your viewport in rendered shading preview while rendering the scene?
    I learned this the hard way 5 years ago; I remember running out of memory on my 1070 while rendering a heavy scene in Cycles.
    I noticed I had set the viewport samples and render samples to the same value, 4096.
    I was curious why it was able to use 6GB of VRAM in viewport shading but not in the render.
    Turns out, while I was rendering the scene, both the viewport and the render window were using VRAM at the same time.
    Once I switched my viewport to an image editor layout, it started rendering each frame in under 15 minutes instead of throwing an "out of memory" error.
    Nowadays, even when using a 3090...
    I always turn on [temporary editor > image editor] in the settings and hit Ctrl+Space in one of the windows for the render (so that the other windows are inactive while I am rendering).
    In my case it has improved my renders 3x to 4x ever since I became aware that viewport shading also uses VRAM while I am rendering.
    Frame 1: 18s vs 1min09s (with viewport shading in the background) on the same scene and settings from 5 years ago.
    Not sure if your case is the same as mine. Hope this info helps someone.
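    This trick can also be scripted. A small sketch (assuming Blender's Python API, bpy, and its `render_display_type` view preference; harmless when run outside Blender) that sends renders to their own window so a rendered-preview viewport isn't also holding the scene in VRAM:

```python
def render_display_settings():
    # 'WINDOW' shows the render result in a separate window, so other
    # editors (including a shaded viewport) stay inactive during renders.
    return {"render_display_type": "WINDOW"}

try:
    import bpy  # only available when run inside Blender
    for name, value in render_display_settings().items():
        setattr(bpy.context.preferences.view, name, value)
except ImportError:
    pass  # running outside Blender; nothing to apply
```

    The preference name is taken from Blender's documented API; whether it helps as much as the manual layout switch above is worth testing per scene.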

  • @R0NINnoodles
    @R0NINnoodles 2 years ago +1

    Would love to see a part 2, specifically rendering in Unreal Engine 5

  • @MrTomanonamous
    @MrTomanonamous 2 years ago

    This is really helpful, thank you! Everyone else just concentrates on games, and that's not what we need.

  • @louixlinart
    @louixlinart 1 year ago +2

    This is such a helpful video. I'm in the midst of deciding whether I should upgrade to a 3090 from a 3060 or just take a big leap to a 4090. Plus, coming from someone who not only knows about PC specs but also does 3D themselves, it's a lot more reliable than just watching some random benchmark videos. Thank you for making this video. Cheers!

    • @hman6159
      @hman6159 1 year ago

      What did you end up doing

    • @louixlinart
      @louixlinart 1 year ago +1

      @@hman6159 Nothing yet 😆 still saving HAHAHA

  • @raven_glass
    @raven_glass 1 year ago +2

    Worth factoring in the cost of a new PSU and (probably) case when calculating the cost of a 4090, since most older rigs that supported a 3000-series card will not be able to handle or fit one of these new cards.

  • @fabianoperes2155
    @fabianoperes2155 2 years ago

    Man, before I watch the video, I just wanted to say you are GORGEOUS! GOSH!!!

  • @IndyStry
    @IndyStry 1 year ago +1

    Great video, waiting to see this exact comparison for Unreal Engine 5.1 with a scene with really heavy 8K textures that push the VRAM. Also be aware that the 4000-series cards don't support NVLink, in case someone is thinking of getting two (well, you physically won't be able to fit two anyway). This might be why two A6000s or two 3090s might be a better bet for those looking to stack another card later down the line for more performance.

  • @khoifoto
    @khoifoto 1 year ago +1

    The fun part is, when my friends and I went into Micro Center and each of us walked out with a 4090, people cursed at us for being scalpers. Little did they know we're just building our own rendering stations :( . This card is a blessing.

  • @charliesteer
    @charliesteer 2 years ago +2

    I would love to see it tested on Houdini's Karma XPU. Particularly viewport rendering and rendering of transmissive surfaces. It's still in beta but I'm using it as my main renderer.

    • @alexmehler6765
      @alexmehler6765 1 year ago

      They all use OptiX as a backend, so the benefits would be similar across the board; see V-Ray, etc.

  • @klaus6474
    @klaus6474 1 year ago +1

    I got the RTX S 5100
    CUDA cores: 41,984
    Boost clock: 2.3GHz
    Memory: 128GB GDDR6X
    Memory bus: 3760-bit
    Memory bandwidth: 2036GBps
    RT cores: 264 (3rd-gen)
    Tensor cores: 1042 (3rd-gen)
    NVLink SLI: No
    PCIe: Gen 5
    HDMI: 2.1
    HDCP: 2.3
    Display connectors: 2x HDMI 2.1, 4x DisplayPort 1.4
    Length: 15.3 inches
    Width: 6.0 inches
    Height: 4-slot
    Maximum GPU temp: 102
    Graphics card power: 460W
    Recommended power supply: 1000W
    Power connectors: 5x 8-pin (with supplied 26-pin adapter)

  • @jamesquao1028
    @jamesquao1028 2 years ago +1

    Thank you for making this video. I purchased a 4090 thinking that maybe I had overspent, but as a 3D visualiser you have helped me justify with a smile that I made a great investment

  • @TempyEdits
    @TempyEdits 2 years ago +2

    This video was off the charts

  • @SyntaxDomain
    @SyntaxDomain 2 years ago +1

    Great video! I'd love to see a Houdini and Substance Painter video as well if you get a chance.

  • @zaydraco
    @zaydraco 1 year ago

    This is the first serious content-creator review, not just for YouTubers

  • @amanda.collaud
    @amanda.collaud 1 year ago +2

    He rendered at 16K resolution, which favours the 4090. No one noticed?

  • @bikelifewithalex120
    @bikelifewithalex120 1 month ago

    Great topic, not many videos out there on the production side of high-end GPU cards.

  • @dantheman1998
    @dantheman1998 1 year ago +1

    I figured the 4090 was going to sell out at the price it did for this exact reason: professionals

  • @GabrielGabeRodriguez
    @GabrielGabeRodriguez 2 years ago +4

    Great video. The algorithm populated this for me! Just wanted to speculate: when you mentioned the 3090 taking 45 mins per frame, the 3000-series FE cards are notorious for having really bad thermal pads and poor cooler alignment in the early batches (2020). I recently bought a 3090 FE and noticed the memory temperatures at thermal throttle (110C+). Because GDDR6 and 6X have error-correcting components, if the memory heats up too much it can start to trip over itself and create errors that slow down its performance. Using quality thermal pads and improving the cooler's seating has increased the thermal headroom on my memory, and it's running at a nice cool 92C max (my alignment might not have been perfect, as some other people reported an 88C max temp with the most memory-intensive applications... mining).

  • @Didjelirium
    @Didjelirium 1 year ago

    I cannot wait to try this card in Blender but for now the closest I got to a 4090 was by downloading a 3D model of it then zooming in on the details. XD

  • @MikaiGamer1286
    @MikaiGamer1286 2 years ago

    Glad I found you with this. Hardly ever see 3D artists benchmark these cards, and I haven't done it in years now, even after graduating with a degree and everything, as there's no work in Florida for it

  • @TGA_anim
    @TGA_anim 2 years ago

    FINALLY, this is what I was looking for, not reviews that talk about video games and stuff. 3D is my thing

  • @ivanoleaanimator
    @ivanoleaanimator 2 years ago

    I think there also needs to be a discussion of the power consumption of a machine that uses the 4090. Off the top of my head, I believe it ranges from around 350 watts up to 600W, which is like running a handheld vacuum cleaner while you render, and half that at idle.

  • @nerukas86
    @nerukas86 2 years ago +1

    Great video, that's what I call quality content! Thank you.

  • @blazbohinc4964
    @blazbohinc4964 1 year ago

    Your correction is only half correct.
    If Blender was swapping memory to RAM, there's no way it would take that long; however, if it was swapping to an SSD, then yes. I don't know what your setting was. Usually, VRAM issues are easily avoidable if you calculate with tiling. It takes longer depending on the speed of your SSD, but if the GPU doesn't have to store everything in VRAM... you can render almost anything.
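    The tiling point is easy to put numbers on. An illustrative calculation (float RGBA, one pass; real Cycles buffers also hold denoising data and light passes) of how much framebuffer memory a full 16K frame needs at once versus a single tile:

```python
def framebuffer_bytes(width, height, passes=1, channels=4, bytes_per_channel=4):
    """Memory for an uncompressed float framebuffer."""
    return width * height * passes * channels * bytes_per_channel

full_16k = framebuffer_bytes(15360, 8640)   # a full 16K float RGBA frame
one_tile = framebuffer_bytes(2048, 2048)    # a single 2048px tile

print(full_16k // 1024**2)  # 2025 MiB for the whole frame at once
print(one_tile // 1024**2)  # 64 MiB per tile
```

    The gap grows with every extra render pass, which is why tiled rendering keeps huge resolutions within a fixed VRAM budget.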

  • @gurratell7326
    @gurratell7326 2 years ago +1

    Huh? EVERYTHING in Cycles is raytraced, so that Junkshop scene does fully use the RT cores. And rendering with shared system memory works flawlessly; I've done HUGE scenes in really high res that are way more complicated than that simple one you had, with my 3060 Ti, in way shorter time than yours. So yeah, you're doing something wrong :)

    • @SirWade
      @SirWade  2 years ago

      Apparently I've got some research to do on the Hybrid VRAM front! I suspect my 2yr old test had something to do with the animated meshes in the scene though - alembic cache files likely caused some of the issues. But on the RT side I mostly meant that the Junkshop shaders aren't quite as reflection / refraction heavy as something like the Maya example file :)

  • @lhbbq
    @lhbbq 1 month ago +1

    @Sir Wade Neistadt, thank you for sharing. I like the A6000, but is it unfair to compare it to a 4090? When you look at the A6000 at 300 watts and the 4090 at 450/600 watts, what kind of results would you get IF the A6000 had that kind of power budget? That would let you compare the cards almost apples to apples, up to 24GB of VRAM.
    Your thoughts?

  • @danny3man
    @danny3man 2 years ago +1

    Very nice. I'm also interested in benchmarks using V-Ray, FStorm, or Octane (if they have a standalone benchmark).

  • @MrPablosek
    @MrPablosek 1 year ago +1

    The speeds are insane.
    I only have a GTX 1060 6GB, and rendering is very slow on that thing. Seeing the speed of the 3070 was a huge eye opener, let alone the 4090!

  • @sweardog
    @sweardog 2 years ago +1

    Cycles runnin' like Eevee on the 4090

  • @SuperQuesty
    @SuperQuesty 1 year ago +1

    Beyond fast, Beyond your wallets, Beyond the skill.

  • @caninac
    @caninac 1 year ago

    It would also be interesting to do a calculation based on the power consumption of an animated render, which would give an indication of how much air conditioning would be needed for a render farm of these guys.
    Also, with these sorts of tests, the OptiX denoiser would be a better choice than Open Image Denoise. Yes, Open Image Denoise is the better denoiser, but it's CPU-bound, not GPU-bound, which skews the results
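    The power math here can be sketched directly. A hedged back-of-the-envelope (the ~450 W per-card figure and the farm size are assumptions) converting a render farm's draw into energy used and the heat load the air conditioning must remove, using the standard 1 W ~ 3.412 BTU/hr conversion:

```python
def render_energy_kwh(watts, hours):
    """Energy consumed over a render session."""
    return watts * hours / 1000.0

def cooling_btu_per_hour(total_watts):
    """Heat load the AC must remove; 1 W of draw ~ 3.412 BTU/hr of heat."""
    return total_watts * 3.412

farm_watts = 10 * 450  # ten 4090s at an assumed ~450 W each under load
print(render_energy_kwh(farm_watts, hours=8))        # 36.0 kWh for an 8 h render
print(round(cooling_btu_per_hour(farm_watts)))       # ~15354 BTU/hr of heat
```

    At those numbers, a ten-card farm needs well over a ton of cooling capacity (12,000 BTU/hr) running continuously.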

  • @geekydomstudios
    @geekydomstudios 1 year ago

    Please make a part 2! This was an amazing video. 🙂

  • @brabes76
    @brabes76 1 year ago

    It's a great review, thanks for sharing.
    I would also like to say: nice transition effect when you did the correction moment

  • @albertmunoz9075
    @albertmunoz9075 1 year ago +1

    Loved the video! Would love to see you test it with Houdini as well. And would also love a comparison with the 4080 when it comes out. :)

    • @houdinipdf8147
      @houdinipdf8147 1 year ago +1

      Yes! Thank you, very instructive.
      I have to say that I get tired of seeing benchmarks for games/gaming in all the videos, while this card is aimed at pros, in my opinion.
      Albert, same request here; plus, I would be curious to see a comparison on heavy FX sims like the FLIP or Pyro solvers with the 3090/4090 and 4080.
      If you can, it would be awesome to add the electricity consumption :)

  • @Citizen1482608
    @Citizen1482608 27 days ago

    The only reason I ever bought an RTX A6000 was the 48GB, and I do tend to fill that up in my DAZ 3D scenes. I am trying to figure out if I should buy a 4090 system now with a 5GHz CPU or wait for the 5090s.

  • @jasonhoi85
    @jasonhoi85 1 year ago

    Thanks! This is the best benchmark for 3d artists.

  • @o.b.a6035
    @o.b.a6035 2 years ago

    Loved the video, and I watched the last one on the 4090 😁😁😁 Gotta make a part 2 video

  • @gamin546
    @gamin546 2 years ago +1

    Finally, a video benchmarking the RTX 4090 in animation and rendering. Hopefully this dude gets more views and subscribers, because he honestly deserves it.

  • @taimuralix
    @taimuralix 1 year ago +1

    Can't wait to see the 4060 & 4070 and their performance per watt

  • @novakmiler7944
    @novakmiler7944 2 years ago +1

    Can you compare it to two 3090s? :)
    That's currently what I'm running. Wondering if it'll make any difference. Blender user

  • @othoapproto9603
    @othoapproto9603 1 year ago

    It's 1/27/23. Just built a new PC with an AMD 7950X + 128GB RAM + RTX 4090, with the current Win 11 Pro OS and Nvidia drivers. I can't F12 render in Cycles or the viewport. I've tried too many things to list.