I feel like you guys were really apologetic about entering the YouTube monetisation program, but in my opinion it's something I do not want you to apologise for. You are making quality free content for everyone, and watching an ad is the least I can do to pay you back for it.
Couldn't agree more, take the money
not quality free content, but free quality content :D.... I do agree
True
would love an updated version of this video, great content !
The Z orientation in different programs has a reason. It depends on whether the 3D program evolved with an engineering/CAD mindset first or not. 3ds Max was the way to bring AutoCAD files into 3D, and AutoCAD has Z up because... originally people drew the 2D plans of a house on paper at a desk, so the original X/Y was the "floor". Then, thanks to computers, the third dimension became possible, and that way Z was up. Maya/Houdini don't have this background: X/Y is the plane of your monitor and the third dimension is depth. Blender... I don't know. XD.
Blender also has the Z axis as up. I came from Cinema 4D so this was basically what turned me off to using it.
@@kuunami Blender has Z up because for a long time they thought Max was the measure of all things....
@@kuunami dumb reason
Cant get enough Entagma rambling!
Would love to upvote a camera animation tutorial! It's always been something I've been unsure of how to get right and is so fundamental to many shots. Good moment to also add my thanks for so many hours of invaluable helpful tutorials.
No need to be sorry, guys! Make your money. You deserve it. I think we've all learned a tremendous amount from you both. I don't mind a few ads.
I could listen to these rambles all day
I do VFX in Houdini from home for a studio that uses Maya. In order to get point attributes/particles into Maya, I export my effects as .rs proxies (since they use Redshift; .rs is a Redshift scene description file). In Maya, in the Hypershade, Redshift has UserDataScalar/UserDataVector nodes that can be plugged into the material. So with that method I am able to get particles/point attributes into Maya. Of course the downside is you can't change the geometry once you export to a scene description file. The cool thing about Redshift is that once you buy a license, you can use it for both Maya and Houdini. So you can easily send information back and forth and you don't have to buy a "Houdini" version of Redshift and a "Maya" version. This lets you get point-velocity motion blur too.
6:30
Apparently in Australia different regions had different railroad track gauges, so if you were transporting goods from New South Wales to Queensland you had to unload the whole train and reload it onto other trains. This went on until relatively recently.
Absolutely love your channel! It's refreshing to hear an honest appraisal of things as they stand. The whole purpose of any design software is how it lends itself to a creator's imagination. Totally agree about the lead wall comment - hopefully someone is listening before it's too late. I'm a beginner trying to learn and understand Houdini...
Very topical subject. And, as you pointed out, especially for small teams and individual artists. At the first mention of Solaris (2018?) I never thought that at the end of 2021 we would still have no clue when and how the final, production-ready Karma will arrive, IF it will get a GPU option for fast lookdev scenes or not, and that Houdini would float in a state where the only built-in engine option is Mantra with its still very slow CPU calculation. As an indie artist I'm waiting for the release of H19, and if there is no solution to this I will consider starting to work with Blender.
To be honest I have even started to generate theories, kind of like what you mentioned at 29:17 :) Such as: the connection between Epic and SideFX has become tight lately, not to speak of the huge investment Epic landed in SideFX a few months ago. I was brainstorming about what SideFX could need from Epic and what they could offer in exchange, since such connections between dev teams don't appear just out of friendship, especially when a pile of money is involved. And yes, UE would need a fast procedural asset-building system, not just on an orthodox polygon level but also for their Nanite geos. Since, as you mentioned with the Alembic export issues, Houdini is capable of identifying point data while other 3D DCC tools fail at that task, I can imagine that an engine creating procedural asset data for Nanite-based environments would use point data rather than verts. Anyway, it is predictable what SideFX could offer Epic (especially for UE5), and it was already introduced with project "Titan"; but it is also clear where Houdini is lacking and where Epic, with their experience, could patch that lack: a dedicated lookdev render engine. A few weeks ago UE released their pathtracer. I asked about VDB rendering options in their introduction stream and the answer was: at the moment it can handle just geometry. (At the moment.) I guess you see in which direction my brainstorming is bending :)
But well, it is wild to call this theory even a daydream. To be honest, though, to my eye it still seems more rational that Epic would provide a fast lookdev engine for Houdini than a scenario in which, in a few months, SideFX introduces a super stable Karma with a lightning-fast GPU option (Arnold GPU was in development for more than 6 years or so?). And yes, probably none of the big studios miss a fast built-in render engine in Houdini, but Houdini Indie is out there for a reason, and its users (including me) are left behind with the existing Mantra/Karma solutions. I have tried a lot of third-party render engines, but there were always some issues - unfortunately, always.
Never is too much rambling! I would happily listen to a 1.5h podcast ;)
So many great points - I think you all should bring up Houdini's viewport downsides in every video. I regularly switch the Houdini viewport to its most basic viewing option to avoid all the glitches and artifacts that show up in the OpenGL shaded options.
As much as I love videos on the tech aspect of working in software, i.e. the how-tos - we all have to step back from the work we do and take a broad look at our own context. Videos like these are gold in that, for myself at least, they serve as an extension of and a helpful supplement to the considerations I need to think about. Thank you.
27:35 THANK YOU! I thought I was the only maniac coming from Blender to Houdini who absolutely hated those damn gizmo handles. My workflow becomes frustratingly slow trying to select those handles and move/rotate objects instead of just 'g->z->mouse' or 'r->x->78'
Love these talks, its always good to get into the real world details and headaches of production.
Sat with some simulations recently that were for a project in a C4D/Corona workflow. Due to the complexity and size of the data, as well as working remotely, a data exchange via .abc was not on the cards. The most practical solution turned out to be rendering all elements with masks and AOVs in Houdini/RS, which were then comped into the Corona renders at the studio. All the data that was ever exchanged was cameras and proxy geometry used for reflection and refraction passes - and of course EXRs. Changes were much easier and quicker to handle this way too.
All that said I can't wait for a proper POV-Ray integration into Houdini. :-)
Man! As a Houdini user I can't believe how in sync I am with everything discussed here; I agree with the same issues and opinions, very close to 100%! Just an addendum: for me, today, the most lacking feature in Houdini is, first, a viewport like Eevee and, second, a native GPU renderer. Awesome talk!
what is it about the viewport in eevee that you enjoy? Just curious!
@@nathanbayne3576 Just the fact that it's so easy to see any shader result and procedural texture really close to the final render, with no sweat, is already a big deal for me; in Houdini it's so hard that I don't even bother setting it up anymore. I'm not even mentioning the performance with all the FX, volumes and lighting at the same time, and getting screen-space radiosity without having to do hacks is a bonus too. Just download it and see for yourself.
@@mzigaib I have used blender quite a bit, I’m just always curious to hear opinions, Thanks!
You guys are really heroes of Houdini. Thanks to you I found the courage to study Houdini. PS: the audio quality is excellent but has some lag ;)
You guys are funny, I really like your honesty, ahah. Hats off for everything you do for the Houdini community.
Loved the conversation! Taking geo and volumes to other DCCs for production is such a pain!
Would love more 3delight tutorials, I've been playing around with it and love it so far.
26:27 I'm relatively new to Houdini but I really love the camera. It seems to be better than most…
A nice talk. Rendering has always been the most dreaded and expensive task in Houdini for sure. Lately I find myself gravitating more towards Blender because I know that rendering will be quick and a breeze later on in the process compared to Houdini... it's like a friend who may be lacking in intelligence and sophistication but is much more available and easygoing when helping out with daily chores.
Maybe I'm weird, but I love animating cameras in Houdini especially with CHOPs. That was something I found frustrating when starting to add Blender to my workflow funny enough but totally agree on everything else :)
Very interesting talk! I am one of those who does everything within Houdini. I agree Solaris is too much work for a small team. I really enjoy lookdev with RS inside Houdini though, but my background is over 15 years of 3ds Max - and there are unfortunately not many things I now think are better in Max.
You guys always provide very good insight into the industry; I really like watching your videos.
One thing I disagree with though: Karma can be used in production - we've been using it since Houdini 18. Yes, it is/was not as feature-rich in the beginning, but the only thing it really lacks is usability. I like your nuclear reactor analogy because it applies to Karma as well at the moment.
But you can use it in production; we've produced around 2000 shots with it so far, so it works ;)
One thing I've noticed is that since I've been using USD/Solaris, now for almost 2 years, I cannot imagine going back to previous workflows. It gives you so much freedom and stability in handling data, no matter where it comes from, that there is no way back once you are used to it.
I'm in a similar position with Houdini and Blender, but my opinion on Houdini is different :-)
Shading and rendering in Blender are damn good in my opinion, organizing items with Collections and Empties is way better, and Blender is a lot more stable and direct.
But Houdini is just a lot faster at editing bigger meshes, and the modeling tools are quite a bit better (construction plane, orientation picking, deformers, assets etc.). You also have multiple ways of rendering variations and managing scene complexity, with ROPs, TOPs, Solaris, takes, force objects, copy relative references etc. But yes, Houdini is still overcomplicated for many simple tasks; they get the tools right, but the workflow is still lacking. Still, I really enjoy the flexibility Houdini gives me, since you can tackle almost anything.
Thank U again Guys !!!
I would love longer podcasts as well!!
You can pretty much build the whole scene in Houdini inside a single container and then convert it into an HDA, then render that out in Unreal Engine 5. Houdini Engine is very responsive and you have a pretty much realtime preview. If you want to render in Houdini (maybe you have complex FX or volumes), you can still use this method with UE serving as your realtime lookdev preview. The viewports are synced as well. You could give that a look!
You do not need to go that far: Unreal Engine has had USD support in beta for 2 years or so. Just export your Houdini scene as a USD file and load the stage into Unreal. I tested it - pretty amazing. It even supports lights and the Principled Shader, so in theory you can shade and light in Houdini, if you so choose, and render it with Unreal.
@@hannesreindl5714 Sounds cool! Are there any video tutorials on this entire workflow?
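A minimal Python sketch of the Houdini side of the USD hand-off described above, assuming a Solaris /stage network whose last LOP is named "OUT" (that path, the output location and the choice of a flattened export are assumptions; a USD ROP's "Save to Disk" is the usual interactive route):

```python
# Hypothetical sketch: flatten a Solaris stage to a single .usd file
# that Unreal's USD Stage workflow can open. "/stage/OUT" and the
# output path are assumptions.
import hou

out_lop = hou.node("/stage/OUT")                  # assumed last LOP in the graph
usd_stage = out_lop.stage()                       # composed, read-only Usd.Stage
out_path = hou.expandString("$HIP/usd/scene.usd") # expand $HIP to a real path
usd_stage.Export(out_path)                        # write a flattened .usd to disk
print("wrote", out_path)
```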
I mainly do the first half of my work in Blender and then I bring everything into Houdini to render in Redshift. It's a very fun workflow, but the biggest problem is trying to have things like shaders be driven by the animation. If I were just using Blender or Maya I could have something like a wrinkle map activated by a very simple driver. Other than that I find rendering with Redshift in Houdini to be pretty awesome, since Redshift uses Houdini's native geometry. It's so cool when you can load in a massive rigid simulation and it takes just a couple of seconds for the render to start.
I have started learning Unreal and I was also tempted by creating the sims in Houdini and exporting them to it. But I tried a POP sim with 6 million points using the Houdini plugin for Unreal, and after 10 hours of creating the file, Unreal wasn't capable of opening the 45 GB JSON file. Maybe some other workflow may work.
You guys are great, thanks for your tutorials.
Such an interesting discussion. I will confess to being a bit confused by SideFX's approach to their rendering engines. I am absolutely loving 3Delight at the moment.
Great video, really enjoyed the rant/dabble! I am very curious to know your thoughts on Clarisse, have you ever tried it? It would be very interesting to know what you think.
I can never seem to get motion blur to work exporting alembic to Blender, anyone have any suggestions?
Same problem for me too
@@AlbertThomasT It's so irritating man. I don't really want to get a Redshift subscription but there are a few things that make me think I might have to 😑
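One common culprit with Alembic motion blur in Blender (not covered in the video) is a missing velocity attribute: Cycles can only blur a deforming cache when the topology is constant or the cache carries per-point velocities. A hedged sketch of baking a `v` attribute before the Alembic ROP, via a Python SOP whose second input is assumed to be the same geometry time-shifted back one frame (a Trail SOP set to "Compute Velocity" does the same thing):

```python
# Hypothetical Python SOP placed just before the Alembic ROP.
# Assumes input 1 is the same geometry time-shifted back one frame
# and that point count/order matches between the two inputs.
import hou

node = hou.pwd()
geo = node.geometry()                        # writable copy of input 0
prev = node.inputs()[1].geometry()           # read-only geo of input 1

if geo.findPointAttrib("v") is None:
    geo.addAttrib(hou.attribType.Point, "v", (0.0, 0.0, 0.0))

fps = hou.fps()
for pt, prev_pt in zip(geo.points(), prev.points()):
    vel = (pt.position() - prev_pt.position()) * fps   # units per second
    pt.setAttribValue("v", (vel[0], vel[1], vel[2]))
```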
Very nice one!
Blender + Houdini is a great combo, you did forget about OpenVDB!!
And also a Cycles Hydra delegate already exists for Houdini (Tangent made it)!
Are u kidding?
No adaptive subdivision, no motion blur, no OSL support, no NURBS and finally - NO GPU support.
@@umpaumpaify are you talking to me?!
@@workflowinmind Yep, you're referring to the Cycles release from TA (HdBlackbird) in Houdini.
Although for most tasks it is even less applicable now than other renderers; the functionality is very stripped down.
@@workflowinmind There is not a single outstanding plus to using it in its current CPU form; it is better to choose 3Delight, for example.
@@umpaumpaify No need to use that tone... And I never stated anything you are writing.
Motion blur is working (just not for instances), GPU is also doable using a few patches.
I was merely referring to 30:31
useful information. Thanks a lot
I wanted to cry when you appreciated blender!
I still haven't tested it yet, but I'm curious what people are thinking about the Redshift RT beta? Seems like it could serve a similar purpose to Eevee in Blender, although the feedback doesn't look quite as immediate.
Hi, I am a beginner in Houdini and constantly watch your channel to learn Houdini. Can you please talk in general about how to use the Karma XPU render engine to render out a FLIP simulation, and how to use materials on it? I mostly render my simulations in Blender by exporting to Alembic, but I wasn't able to render out the whitewater sim in Blender.
Oh no ! These awesome tutorial makers want to have money for their hard work, how dare they !!! Love your stuff, keep it up.
As a result of this video and mentions on the forum, I thought I would check out the Entagma video on rendering with 3Delight in Houdini (from about a year ago, in 2020). However, that video shows 3Delight in the OBJ context. With Solaris I find it much easier to set up complex lighting, so I was hoping to try 3Delight in that context. Their website doesn't seem to indicate a plug-in for the Solaris context; although they have a Hydra delegate, it seems one would have to do some of their own 'building' to incorporate that in Houdini. Any thoughts?
Would you recommend doing level design/worldbuilding in Blender? I come from architecture and am used to designing buildings and small city scenes in Rhino and Unreal, but neither Houdini nor Blender seems suited to this. I am exporting these scenes for the web so I would like to stay in Blender, but it just seems so limited: the viewport clipping issues suck, and it is difficult to be precise without something like the snapping Rhino has or the grid snap Unreal has. I am wondering if I should design assets in Blender + Houdini, do the level and environment design in Unreal, and export as .usd back to Blender for optimization and baking for web.
What about Clarisse, is it better? And why not suggest FBX 😅?
Which renderer in Houdini do you use most often in commercial projects? I use Houdini + Redshift.
Just wondering - since you mention rendering Houdini stuff in Cycles - whether exporting Vellum simulations as Alembic or FBX from Houdini will give good results in Blender shading? You mention it being a problem, so what's the limiting factor for using Houdini Vellum Alembics with attributes in Blender?
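On the point of reading exported attributes in Blender shading: attributes that survive the Alembic import can be picked up in a material with an Attribute node. A hedged Blender Python sketch (the attribute name "Cd" and the material name are assumptions; whether a given attribute comes across depends on the Alembic ROP's attribute settings):

```python
# Hypothetical sketch (Blender Python): read an imported point
# attribute (e.g. Houdini's "Cd") in a material via an Attribute node.
import bpy

mat = bpy.data.materials.new("abc_lookdev")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

attr = nodes.new("ShaderNodeAttribute")
attr.attribute_name = "Cd"                      # assumed attribute name
bsdf = nodes["Principled BSDF"]                 # created by use_nodes
links.new(attr.outputs["Color"], bsdf.inputs["Base Color"])
```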
You guys rock 🔥 I just wanted to ask what's your microphone and what is the process of getting this fresh clear voice out of this microphone. PLEASE LET ME KNOW 😅🔥🔥
Shure SM7B - we just hook it up to the streaming box. That's pretty much it :) Cheers, Mo
Wonderfully informative, as always. I'm glad you decided to monetize your videos. PS: Did the discovery that YouTube monetizes every video come after my comment about watching ads on your channel on the donut tutorial? ^_^'
Indeed, your comment started us looking into that. Cheers, Mo
While I think exporting geometry to Blender is okay, I've found exporting volumes is painful. In my experience Houdini can simulate volumes faster than Blender can load them, and Houdini's viewport preview with lighting is closer to Mantra's output than using Eevee as a preview for Cycles' output.
I'd love to hear your input on Clarisse, Katana, and Guerilla!
Yaay new episode!
Oh boy, importing Alembic into Blender can be janky: if your first Alembic frame is empty, it doesn't recognize the rest of the frames as having any data. It might be a bug, or maybe there is a button to fix it.
Hah. Try importing Alembic into Unreal without dying inside.
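A cheap workaround for the empty-first-frame issue mentioned above, sketched as a Python SOP dropped in just before the Alembic ROP (the placeholder-point idea is an assumption of mine, not something from the video):

```python
# Hypothetical sketch: make sure every exported frame has at least one
# point, so Blender's Alembic importer doesn't treat a sequence whose
# first sample is empty as having no data.
import hou

geo = hou.pwd().geometry()
if len(geo.points()) == 0:
    dummy = geo.createPoint()               # single placeholder point
    dummy.setPosition((0.0, 0.0, 0.0))      # parked at the origin
```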
This is probably not your vibe, but I would love to see a light instancing tutorial.
Like: How to efficiently copy a large number of lights to points.
Maybe you can find a way to transform this topic into an interesting tutorial if you wanted to. :)
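Not a tutorial, but the classic starting point for the request above is point instancing with an Instance object: tag template points with an instancepath attribute pointing at one template light and switch the Instance object to full point instancing. A hedged Python sketch (all object paths are placeholders, and the exact parm names may differ between Houdini versions):

```python
# Hypothetical sketch: instance a template light onto scattered points.
# "/obj/hlight_template" and "/obj/scatter_geo/OUT" are assumed paths.
import hou

obj = hou.node("/obj")
inst = obj.createNode("instance", "light_instances")
inst.parm("instancepath").set("/obj/hlight_template")  # object to instance
inst.parm("ptinstance").set(2)                          # full point instancing

# Pull the template points into the Instance object's SOP network.
merge = inst.createNode("object_merge", "template_points")
merge.parm("objpath1").set("/obj/scatter_geo/OUT")
merge.setDisplayFlag(True)
merge.setRenderFlag(True)
```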
Coming from Max, I learned Houdini using V-Ray. It took a little getting used to, and since it's still fairly new, information was sparse.
However, now I even have my Substance node presets set up for V-Ray, and I'm getting beautiful results super fast!
Is gaffer a legitimate contender in the lookdev category?
Really wonder what Karma GPU will mean for Redshift users once it’s ready.
finally some ranting! =) although mild mannered.
What is your guys' take on RenderMan (24) in Houdini?
Hey guys, I'm new to Houdini and 3D software.
Should I use Redshift over Mantra or Karma?
My CPU is an AMD 3990X, which has 64 cores, and my GPU is an RTX 3060 (8 GB), so my CPU is much stronger than my fairly ordinary GPU.
From my understanding Redshift is a GPU renderer, so I think it doesn't make sense for me to use Redshift when my CPU is much faster???
I'm so confused, because so many Houdini tutorials now use Redshift, so I don't know how to follow the rendering part of their tutorials🥲
Please help me guys😭
Karma cpu
are you guys planning on doing tutorials for c4d?
Not in the near future, I'm afraid. Cheers, Mo
Love the chat❤
You guys are making me learn Blender instead of continuing to hit a wall learning Houdini. If you were learning a program from scratch, would you still go with Houdini?
Absolutely. Without hesitation. Cheers, Manu & Mo
Ads are fine... the content matters. Would love to see Entagma's dive into the new fields-based Geometry Nodes in Blender once it's reached feature parity (mainly porting nodes) with the current state that's based on the earlier concept.
The janitor comment was dead on. I feel like at a studio that doesn't have a proper back-and-forth with other DCCs, most of my time goes into fixing issues in that back-and-forth. Solaris, I feel, is such a hassle to lookdev anything in.
The big problem with Houdini for rendering is that it only represents textures in the viewport with Mantra; if you use another render engine you only see a Lambert colour. That's horrible - the GLSL shader support is very poorly implemented. If we compare with Maya or Blender, their viewports are more flexible in terms of compatibility with other engines.
That's not true at all. The reason the Mantra shaders show up is that they have the OGL tags already added by SideFX.
The OGL tag concept means anyone can simply add a tab to their shader and hook up references to the engine's inputs that map to the OGL stuff.
@@lewisvtaylor For this reason: because it is very tedious to adjust all the OGL tags by hand, when in other software it's a simple click.
@@fcojavier142 I think you are misunderstanding. It is usually on the render dev's side that the tags are set up for their supplied shaders. I'm pointing out that if that's not the case, you can simply roll it out yourself. It only needs to be done once.
My reply was to highlight to you that it does indeed work.
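For reference, a hedged sketch of what "rolling it out yourself" can look like: adding one of Houdini's documented viewport parm names (ogl_tex1, the GL diffuse texture map) as a spare parameter on a third-party material and channel-referencing it to the renderer's own texture parm. The material path and the source parm name "tex0" are assumptions.

```python
# Hypothetical sketch: give a third-party material a viewport texture
# by adding the "ogl_tex1" spare parm and referencing it to the
# renderer's texture parm. "/mat/my_rs_material" and "tex0" are
# assumptions; use your material's actual node path and parm name.
import hou

mat = hou.node("/mat/my_rs_material")
ptg = mat.parmTemplateGroup()
ogl_tex = hou.StringParmTemplate(
    "ogl_tex1", "OGL Diffuse Map", 1,
    string_type=hou.stringParmType.FileReference)
ptg.append(ogl_tex)
mat.setParmTemplateGroup(ptg)

# Keep the viewport texture in sync with the render texture via a
# channel reference instead of a hard-coded path.
src = mat.parm("tex0")
mat.parm("ogl_tex1").setExpression(
    'chs("%s")' % src.path(), language=hou.exprLanguage.Hscript)
```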
Rendering in Houdini is a big fear lol, one day...
What's that bit about camera animation being especially hard in Houdini? Isn't it handled like any other object? Pardon my ignorance - I've hardly touched Houdini and I certainly did no animation in it.
It is possible to use HDAs inside Blender using the OpenMfx branch of Blender. You can find it in the GitHub repo by Elie Michel.
Also, particles can be imported into Blender using Alembic. For rendering, geometry can be instanced on them using Geometry Nodes. I am not sure which attributes (if any) get imported along with the positions of the particles.
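The import half of that particle workflow can be scripted on the Blender side as well; a minimal sketch (the file path is a placeholder), after which an Instance on Points setup in Geometry Nodes can scatter render geometry onto the cached points:

```python
# Hypothetical sketch (Blender Python): import a Houdini Alembic cache
# and list which attributes came across on the imported meshes.
import bpy

bpy.ops.wm.alembic_import(filepath="/path/to/particles.abc")

# The importer leaves the new objects selected; inspect their attributes.
for obj in bpy.context.selected_objects:
    if obj.type == 'MESH':
        print(obj.name, [a.name for a in obj.data.attributes])
```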
This video might need to be re-recorded because of Karma in H19 :D
What do you think about karma now -in 2024?
What happens in houdini, stays in Houdini /
love Mantra (but hate my 8cores) / hate to have to stay with redshift (but love its speed)
// Will we have the luxury to see any time an ad for Mops or Aixponza before content begins ?
// make it fun: loud promo voices like the Honest Trailers guy, logarithmic noises, pulsating edits and pink sneakers ;)
im gonna invest in Octane for now 🙌🏿 even tho my name is Arnold
Yeah that's fair if they're doing it anyway, might as well take advantage of it.
There is a question that perhaps you forgot to put on the table: are VFX artists there to do creative artistic work, just like a Da Vinci painting, or is it about doing precise mathematical calculations and pressing buttons? Not that many engineers can be artistic just by knowing which buttons to press.😆
adblocker is a friend
Just wondering
I left Redshift for 3Delight, and I'm not going back. This render engine is insane - killer IPR, snappy and fast. Try the free demo, you won't regret it.
I'll try this out. Would I need to edit the env file for Houdini?
@@Helios.vfx. no install and use, it will install a config in the package folder
@@TTROPVNR Thank you bud!
@@Helios.vfx. they have a discord
@@TTROPVNR Entagma or the 3delight folks?
Maybe you missed rendering in game engines like Unreal?
I did rendering in Mantra for years and I don't see any renderer that is that flexible (without good C++ knowledge): you can hack lights, shaders, even the pathtracer, with only VEX and Python.
Yes you should render in Houdini. It’ll free up hours of spare time to get out of the house :)
Didn't Tangent Animation release hdCycles, its open-source Cycles Hydra render delegate, for Houdini?
Are u kidding?
No adaptive subdivision, no motion blur, no OSL support, no NURBS and finally - NO GPU support.
6:58 "They are arbitrary", I have never been so offended, right hand rule is the universal rule in physics (according to the definitions of positives). It is NOT arbitrary. It is physical.
we should start a campaign to get Houdini render engines faster!
Manuel, you're a budding Blender fanboy - embrace it, don't apologize. 😂
Rendering should not be so complicated that it requires harnessing a complete paradigm shift in tech. Unfortunately this is what has become of Karma... and Houdini/Karma as a rendering pipeline has become way too complicated, even for medium-level shops (i.e. shops that do commercials and can handle a 20-shot movie sequence).
Redshift is so well thought out. It supplies 95% of the solutions, with the exception of explicit relationships with attributes at render time.
Make these podcasts longer not shorter!!!!
Why would you be sorry for ads? Should you not be paid? Must be a cultural thing - enjoy your money, you earned it, dude.
The main problem in Houdini for a simple generalist is lookdev; it's a pain in the ass. The Asset Manager looks like it came from the 90s, there are no libraries like in C4D or the upcoming Blender 3.0, and viewport support for third-party renderers is just meh.
sorry to resurrect this post but what do you think of the new asset manager that was presented for H19?
From this conversation it became clear that Manuel uses Cycles when rendering simple geometry, but I would like to know what Mo uses - is it always Houdini and RS or Octane, or does he also jump to other engines for lookdev? Thanks for the great conversation, guys!
Houdini with Redshift or Octane it is for me. Cheers, Mo
I want you to monetize your videos. You guys deserve it.
just use adblocker, what's the deal?
I'm only 200 hours in... what is the summary? I like Karma. Redshift didn't work. V-Ray looks promising, and I'm about to try Octane.
Houdini engine for blender would solve so many of these problems!
I have to say, Gaffer is closer to the Houdini workflow than Solaris is to Houdini.
cool god like u
Nerd Rumble !
YT ads - who cares. I hope you get something out of it.
Cheers and thank you for your insights.
You are not forced to watch ads - you can buy YouTube Premium if you want to watch without ads.
In a nutshell, what did they decide there? :)
POV-Ray.
Why are you guys sorry for ads?
If any of the Russian folks are watching this video, play it in Yandex with their new bot voice translation. It's fun and a real breakthrough :) Yes, it's a bit rough, but it's just damn awesome.
Nevertheless, Alembic is a wide door that's always open... nevertheless )))