Matt Damon of VFX
That is what I’m thinking too, lol
Stop trying to sell me on Clarisse!....cuz it's working! 🤣
It's a shame that Isotropix ghosted us
2 years later and I still miss Clarisse. What a damn shame it just disappeared.
so sad :(
This is a sad video now, RIP Clarisse
Clarisse Angie is also great because eventually you'll be able to render with CPU+GPU simultaneously! That will give artists a lot more speed! :)
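The appeal of CPU+GPU rendering is easy to see with a toy work-split. This is not how Angie is actually implemented (that's undocumented here); it's just a sketch of why using both devices beats either alone, assuming tiles are handed out in proportion to each device's measured throughput:

```python
# Hypothetical sketch of hybrid rendering: divide a frame's tiles
# between CPU and GPU in proportion to their relative throughput,
# so both devices finish at roughly the same time.
# NOT actual Angie internals -- purely illustrative.

def split_tiles(num_tiles, cpu_speed, gpu_speed):
    """Return (cpu_tiles, gpu_tiles) proportional to device speed."""
    total = cpu_speed + gpu_speed
    gpu_tiles = round(num_tiles * gpu_speed / total)
    return num_tiles - gpu_tiles, gpu_tiles

# A GPU three times faster than the CPU takes 3/4 of 64 tiles:
cpu_tiles, gpu_tiles = split_tiles(64, cpu_speed=1.0, gpu_speed=3.0)
# cpu_tiles == 16, gpu_tiles == 48
```

With a 3x-faster GPU alone the frame takes 64/3 ≈ 21.3 tile-units of time; splitting 16/48 brings both devices in at 16 units, a ~25% speedup over GPU-only.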
I love Clarisse and Angie. It's less a C4D-sized application than a framework for building scenes and shots, and it doesn't end up crashing from lack of VRAM. I'm constantly surprised at the tools and the 'aha!' thinking behind them. Clarisse's flexibility in its workflow is unmatched. I can't get Angie to work with my GPU, though; I have a Titan X 16GB with game drivers installed, and it's on the supported list, but it's not working.
Make sure you are using the latest driver, and check your NVIDIA Control Panel to confirm Clarisse is using your GPUs.
Very cool dude, can't wait to see what you will do with it!
Glad to see your video after so many months
Please update this info on Clarisse and Angie
Time per frame or noise threshold... LuxCoreRender does the same; I love that feature.
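Those two halt conditions combine naturally in a progressive render loop: keep sampling until either the remaining noise drops below a target or the time budget runs out, whichever comes first. A minimal sketch (not the real LuxCoreRender or Clarisse API; the 10%-noise-reduction-per-pass figure is an invented stand-in for a real noise estimator):

```python
import time

def render_progressive(max_seconds, noise_threshold):
    """Toy progressive render loop with two halt conditions:
    stop at a target noise level OR at a per-frame time budget,
    whichever is hit first. Illustrative only."""
    start = time.monotonic()
    noise = 1.0            # pretend the image starts fully noisy
    samples = 0
    while True:
        samples += 1
        noise *= 0.9       # assume each pass removes ~10% of remaining noise
        if noise <= noise_threshold:
            return samples, "noise threshold reached"
        if time.monotonic() - start >= max_seconds:
            return samples, "time budget reached"

samples, reason = render_progressive(max_seconds=5.0, noise_threshold=0.05)
# With this toy model, 0.9^29 ≈ 0.047, so it stops at 29 samples
# on the noise condition, well inside the 5-second budget.
```

The nice property for animation work is predictability: the time cap bounds worst-case frame cost on a farm, while the noise cap lets easy frames finish early.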
What happened to Clarisse? Why did it suddenly shut down?
Thanks for sharing this amazing tool. I absolutely love it and will be trying it for sure. Would you talk about how Clarisse works with Houdini?
aged fantastically BUAHAHA
Great video. I have been watching Clarisse with interest for at least 7 years but have never taken the dive. Angie sounds interesting. I ditched CPU rendering for GPU/Octane about 8 years ago and then started doing more work in Redshift, as many of my broadcast and agency clients were making that part of their pipeline for production rendering. Long story short, I have been GPU all the way for the past 8 years, and then last year Arnold GPU finally gained parity with its CPU version and I just fell in love with Arnold's final-frame image quality.
I have never had an easier time matching CG to backplates than I have with Arnold. Unfortunately, some of Arnold's GPU features are not quite there yet: trace sets are missing, and you don't have much control over sampling beyond AA when using GPU. Addicted to the look, I started testing its CPU rendering on my 64-core 3990X and found that it's about 25% of the speed I have come to expect from an RTX 3090 when GPU rendering.
I have two nodes with a total of 8x 3090s, and even with all that GPU rendering power I have come to prefer Arnold CPU, even if I have to wait a lot longer.
I know I'm all over the place with this, but one of the things I have come to appreciate about CPU rendering is the stability and the huge datasets I can render without running into VRAM limitations or going out of core.
Which brings me back to Clarisse iFX. Clarisse's claim to fame is the huge datasets it can render. With Angie/GPU, is it still as capable of handling massive scenes? Also, how do you like the look compared to other production renderers? I'm not a huge fan of Redshift's look: yes, you can get a good image out of it, but it takes a lot of fiddling where Arnold takes almost none. Octane is a little better, but it has a look to it that I can pick out of a lineup nine times out of ten.
I think I remember reading that Clarisse is using Autodesk's Principled shader? I could be wrong, but if so, hopefully it borrowed tech from Autodesk's acquisition of Solid Angle / Arnold a while back.
The gold standards in my mind are Arnold and Mantra... I know Clarisse has quite the credit list when it comes to feature films. How would you say Angie stacks up?
Sorry for the long list of questions, but I have followed your work for some time now and respect your opinion. Thanks again!
Wow, you've basically listed every issue I've had with GPU rendering. GPU was like the holy grail, and I've shelled out a lot of money on multi-GPU systems over the years. But a recent project was super stressful due to the barriers you mentioned, like hitting VRAM limits very easily in Octane. I've actually lost clients because I've been in very awkward meetings where they've asked for 'more details' and I'm like, 'I can't do that, my machine crashes if I add even one more tree!' Or 'Can you make the trees blow in the wind?' 'Nope, you're lucky to have trees in there; making them move would be render suicide.' So environments look sparse and the client picks up on it. I've had to sacrifice motion blur just to get scenes rendering (as even a motion-vector pass still has to be computed). I've had to compress all textures from 4K to 1K just to get a project rendering and meet a deadline. All these sacrifices add up to poor results in the end! Even Unreal 5 is handling huge scenes with ease on mid-level systems, with billions of polys, big textures, etc. I think GPU engines became popular way too fast, and it's stifled innovation, as OTOY are already making shedloads of cash from the Instagram crowd. I clearly learnt the hard way that Octane is not a production renderer. I'm going through different tools to try to find something else so I can actually do my job and not be limited.
It's funny, because Octane is by far the fastest at handling a small scene with standard materials. But Blender Guru did a test video where they tried to test all the engines fairly, and for a scene which encapsulated everything (inside/outside GI, refractions, SSS, volumes, etc.) it was actually V-Ray that came out on top in terms of speed. Where Octane shines at raw scene rendering, it's lost as soon as you add fog, volumes, SSS, or a lot of geometry, and it completely falls apart at rendering large-scale scenes.
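The "compress all textures from 4K to 1K" workaround above is really a VRAM arithmetic problem, and some back-of-envelope numbers show why it works. A minimal sketch, assuming uncompressed RGBA8 textures with a full mip chain adding roughly a third (typical GPU estimates; the 500-map scene is an invented example):

```python
# Rough VRAM cost of an uncompressed texture, to show why dropping
# 4K maps to 1K frees so much memory: pixel count falls 16x.

def texture_mb(resolution, channels=4, bytes_per_channel=1, mips=True):
    """Approximate in-memory size of a square texture, in MiB."""
    size = resolution * resolution * channels * bytes_per_channel
    if mips:
        size = size * 4 // 3   # full mip chain adds about one third
    return size / (1024 * 1024)

mb_4k = texture_mb(4096)   # one 4K RGBA8 texture: ~85 MiB with mips
mb_1k = texture_mb(1024)   # same map at 1K: ~5.3 MiB
# A hypothetical scene with 500 such maps: ~41.7 GiB at 4K,
# which no single consumer GPU holds, vs ~2.6 GiB at 1K.
```

That 16x drop is exactly why the downsizing trick rescues deadlines, and also why it costs so much visible detail.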
I too am curious to know whether Angie has any VRAM limitations compared to the CPU, although I couldn't imagine they'd want to put out something that capped you when that IS what makes Clarisse so appealing. Maybe it's still CPU-based with GPU acceleration, or maybe the streaming system has been made to work with the GPU too?
Actually, Blender Guru did not do a good job testing LuxCore... I've been using it (with CPU) all the time for some weeks now, and the results are so much more realistic than Cycles or Octane. The only drawback is the speed (viewport).
Looks cool, thank you.
What version of Clarisse are you talking about?
Thanks mate for this video!
If it were my profession, I'd go the Clarisse route, but since I'm a hobbyist, the price tag is simply too prohibitive. That, and with Unreal 5 being free, Nanite and Lumen pretty much do what Clarisse does for distant scenes (with bokeh and motion blur in a scene, you would have to pixel-peep to know the difference). I guess you could comp the close-ups with Blender renders and get the job done... with free software, basically! Amazing times we live in...
Where did you go to for training videos when you started with Clarisse?
Is there another good alternative to Clarisse?
Probably Solaris.
Unreal 5 maybe? 🤔
Can Clarisse read point cloud files?
Such Sad News
💪👍
And now Clarisse is dead.