Your tutorials are immensely helpful for learning non-human models. Thanks!
You are welcome! Thanks for your feedback!
Hey Chief, I'm a fan of your work. I decided to get an eGPU for my laptop after watching your 'rendering on mining risers' video, my logic being that if a PCIe x1 lane isn't going to be a problem, then Thunderbolt shouldn't be an issue either.
I have an RTX 2060 now, and I noticed that in the Classroom scene I get a render time of 43.26 seconds on Blender 4.0. With a memory overclock of +1200 it goes down to 37.86 seconds, which is almost a 15% speed increase. Core overclocks don't do anything, though. Just wanted to share that, since I noticed that when you overclocked the GPUs, I think you only did the core.
Have a nice one man, enjoy the ungulate characters (I'm a My Little Pony guy myself lol)
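(For anyone checking that number, here's a minimal sketch of how the speedup works out from the two render times quoted above; the variable names are just for illustration.)

```python
# Rough check of the speedup from the render times quoted above
# (RTX 2060, Classroom scene, Blender 4.0).
stock_time = 43.26  # seconds, no overclock
oc_time = 37.86     # seconds, +1200 memory overclock

speedup = stock_time / oc_time - 1.0     # increase in rendering speed
time_saved = 1.0 - oc_time / stock_time  # reduction in render time

print(f"Speed increase: {speedup:.1%}")   # ~14.3%, i.e. "almost 15%"
print(f"Time reduction: {time_saved:.1%}")  # ~12.5%
```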
I have always had issues running an overclock for Blender. Sometimes it becomes unstable. But I'm glad you got good results.
And I'm glad you like the characters haha!
Hey, really appreciate your content. I also had a slightly off-topic question (for this video at least). I noticed you had done render benchmark testing on the 7900 XTX, and was wondering whether, in recent months, AMD has made any strides in general Blender performance? I've been trying to get a reference for how well the XTX does in high-SubD sculpting versus the top 40-series cards, since I pretty much can't find any good benchmarks for sculpting or viewport performance for AMD cards anywhere online.
Hey no worries, ask away wherever you want! AMD GPUs have not had any changes in performance yet. I am hoping that by Blender 4.3 they will fully implement HIP-RT, but even then, I'm not sure how much difference it will make. I will test it right away on my 7800 XT once it's out of beta.
Also, for sculpting, unless I am wrong, the GPU is not much of a factor for performance. Viewport rendering with textures is definitely dependent on the GPU, but sculpting, simulation, modeling, etc. are all CPU-heavy tasks.
So I will test what I can soon and let you know!
@@ContradictionDesign That's really good to hear! Also, thank you for testing in advance; that would definitely put some of my worries at ease. I'm currently using a baseline 2080, so as long as I'd get any performance boost to my current workflow, which is really just high-poly sculpting (sometimes in the millions of quads) and retopo, I'd be happy to swap over.
@@NFNSSynergy Yeah, so for sculpting I would get a higher-end CPU and fast RAM before a GPU. But the 7900 XTX is a fair bit faster than the 2080, so it would still be an upgrade.
@@ContradictionDesign I probably should've started with the fact that I do probably like 60% gaming and 40% productivity on my rig lol, which is why I was so set on making sure it was a good enough GPU. I was looking to get the 7950X3D and 32 GB of 6000 MHz DDR5, both of which would be a massive upgrade from my current setup (i7-8700K + 32 GB of 3200 MHz DDR4). I feel like there's so much fearmongering against AMD products online that people legitimately had me convinced this setup would have worse performance in Blender, until now of course lol. Thank you again for the advice btw.
@@NFNSSynergy oh I see. For gaming AMD is great. AMD GPUs are just slower per dollar for final rendering. But if you don't do tons of renders, that doesn't even matter. So for you the 7900 XTX sounds like a good fit!
Hey, been loving the content so far. I was wondering, is the RTX 4060 8 GB enough for beginner freelance work?
Hey, thank you! The 4060 is OK, mainly because it has RT cores, and I think for beginners it would be fine. But it doesn't take much to use up 8 GB of VRAM, so you'd have to know how much your scenes need.
I've been wondering if the i5-13400F is enough for the RTX 4070, or do I need to upgrade to the i5-13500?
Well, there are always bottlenecks, but those two should be just fine together. Are you using it for 3D work, games, or what?
What do you mean by "there are always bottlenecks"? Do you mean the i5-13400F will bottleneck the 4070? I plan to use it for 3D stuff like Unreal Engine and Blender, and probably some games like Fortnite and Valorant too.
@@Happymaxie So for you I think that combo is plenty. You should get good FPS in games and have plenty of rendering power and cores for 3D. Running 300+ FPS in games would make the CPU the hold-up, but few people are doing that yet, I think.
Found a used 4070 for 300 dollars. Is it legit?
@@Happymaxie Very likely fake, unless they're worried they won't be able to sell it once the Supers are here. But I would avoid that in general. Is it online?