That volume to mesh is exactly what I need
I have been missing this specific workflow for a year 😂 Thank you and congratulations on a banger.
That thing with Blender changing the scene's framerate to sync with the last-imported fbx files screwed me over so hard once.
I still don't understand it ;U;
Thank you! I'm a Blender user and a proud EmberGen Suite owner... I love this video and I hope a lot more videos like this one will come in the future. I truly want to thank you and JangaFX!!!!! ❤❤
I have been waiting for this one!!!
Absolute legend. Thank you for this workflow.
Super cool tutorial, thank you a lot!
Wow! I learned so many new techniques, thank you!
Method looks promising. Thanks, Will!
Will always killing it! much love to the team #JangaTakeover
I was exactly looking for a tutorial like this, nice timing lol
Love EmberGen, but the whole timestep/frame-stride algorithm thing needs a rethink. It's crazy that, with all the advancements of current-gen tech, a program just "doesn't do well" with fast-moving objects. If you know that, then start right there. Make it a priority to circumvent that specific issue in a future update and make it so easy it can literally be an afterthought, not something that completely destroys something you've made because you didn't do the math properly at the start. It seems like this entire process was an afterthought, or at the very least it was putting a band-aid on a bullet wound.
Not quite correct. It is not a simple task in the slightest when programming on the GPU. Is it a band-aid? Yes. Are we working on actually fixing it? Yes, but it will require a lot of work. It is of course a priority, but it won't be addressed until EmberGen 2.0, when we have a full sparse domain and all of our new solver technology ported to that new system. We're just as frustrated with the solution we have; however, for now, with our current knowledge, skills, programming prowess, and codebase, there isn't much we can do short of rewriting EmberGen entirely from scratch, which is what we're doing for 2.0.
@@jangafx is there an ETA for 2.0 already? Just curious. 1.1 is a damn great step in the right direction.
@@sebastianblatter7718 Our goal has been to have a 2.0 beta sometime this year. An official, production-ready 2.0 release date is still unknown. The beta may not make it this year either, as we are pouring focus into LiquiGen and some other tooling to aid in 2.0 development.
amazing video, thx so much!
Super useful tutorial! Keep up the good work!
Great quality tutorial, thank you!
Partner with a professional Blender user and let them do the tutorial better. EmberGen will become more popular. More sales and more productivity.
It helped a lot, thanks!
Thank you for this very complete tutorial. In Blender, I made a simple animation at 24 fps: a sphere moves on the Z axis while accelerating (using the exponential interpolation mode). I export to FBX and load it into EmberGen. I apply the rule Override FPS * Frame Stride = TimeStep, with Override FPS set to 24, Frame Stride to 2, and TimeStep to 48 Hz. The VDB is not synchronized with the animation in Blender. I tried the same animation at 30 fps in Blender, with Override FPS set to 30, Frame Stride to 2, and TimeStep to 60 Hz in EmberGen. The VDB is still not synchronized with my original animation in Blender. What am I doing wrong?
THANKS!
A note about sensor fit: it will use the values below, so when you set it to Vertical you can just adjust that value. Changing the focal length would be a workaround and may not give the exact result.
I love you Will
Can you please make a tutorial on how to composite this into real-life footage 💯
When I get confident in my compositing workflow 😅
god tier workflow, ty
I have a sequence with EmberGen's basic fire and smoke simulations, but I want it with a transparent background. The fire isn't showing up in the renders... any way/ideas to work around it?
The formula FPS * Frame Stride = TimeStep... it would be great if these types of things were specified in tooltips or somewhere in the UI.
Would you please do a tutorial on how to export particle .abc files to Blender and read their exported size attributes from EmberGen? Thank you.
We've found some bugs with regard to Alembic attributes on our side. The ETA for a fix is unknown atm, but we are working on it!
Thank you
Mapping my PNG to the mesh does not work. I see some fire, but the PNG is flat and the mesh is 3D; it doesn't align whatsoever.
Override FPS * Frame Stride = TimeStep
Can I get a pin on this lmao. Keep coming back
@@sirico8987 So for a 24 fps FBX, in the Import tab you set the timestep to 48 Hz?
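To make the formula from this thread concrete, here's a minimal sketch of the relation (the function name is just for illustration, and Frame Stride is kept to whole numbers since EmberGen can't do half steps):

```python
def timestep_hz(override_fps: float, frame_stride: int) -> float:
    """EmberGen's import relation: Override FPS * Frame Stride = TimeStep (Hz).

    Frame Stride must be a whole number -- half steps aren't possible.
    """
    if frame_stride < 1:
        raise ValueError("Frame Stride must be a positive whole number")
    return override_fps * frame_stride

# A 24 fps FBX with a Frame Stride of 2 needs a 48 Hz timestep:
print(timestep_hz(24, 2))  # 48
# A 30 fps FBX with a Frame Stride of 2 needs a 60 Hz timestep:
print(timestep_hz(30, 2))  # 60
```

So yes: for a 24 fps FBX and a stride of 2, the timestep works out to 48 Hz.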
This is great; I have always been missing the correct shading in Blender, especially on fire.
Do you happen to know if adaptive grids/bounding boxes will ever be coming to EmberGen?
We have something even better than adaptive grids in the works! EmberGen 2.0 will contain these new features.
Excited to see what you guys got in store :)
This is urgently needed, I like EmberGen!!! @@jangafx
Such a great workflow for the shading in Blender! I was struggling with this quite a lot and was just combining both using After Effects. Amazing stuff!
I'm still a bit confused regarding the FPS. If I push a 24 fps FBX to EmberGen, do I need to adapt the Frame Stride to 2.5 (default 60), so that when I import back into Blender everything will match?
Set your Override FPS to 60 and leave the Frame Stride at 1! Or you can change the Override FPS value to 30 and the Frame Stride to 2. You can't do half steps for Frame Stride.
Very interesting way to accomplish this results. What about particles ? You found out any solution for that ?
Unfortunately no, because EmberGen only exports particle points, and they need to be instanced in Blender. We're also finding bugs with Alembic sequence imports in Blender, and we don't know if it's a bug on our side or on their side, so it's very difficult to troubleshoot atm 😅
We need a round trip from EmberGen to Unreal with particles, including Blender if needed, please.
When I bring in my camera, which was exported at 24 FPS with 470 keyframes of motion (the whole sequence export is 500 frames), and use this formula in EmberGen, it doesn't matter what I do: the camera motion extends over 1000 frames. I set the Frame Stride to 3, the timestep to 72, and the Override FPS to 24, and it still animates the camera over 1000 frames. I am banging my head against the wall trying to figure this out...
Great info Will! Would the UV projection flames also work for rendering a reflection pass in Blender?
Good question. I don't think it will, since it's invisible to the camera, but I'm not really sure...
I really wish there were a live link between Blender and EmberGen that could update on both sides while you are making simulations.
The coding required for this would be very intense, but as long as you have a dedicated folder and naming structure, you should always be able to open your Blender project and have it load whatever the latest assets are.
How can I get an animated vertex map into EmberGen? Right now, only static FBX seems to work. I still use the free version that comes with my Octane license, so maybe you have already fixed that in a later version?
We only have gradients, and you can animate the gradient value. Animated maps aren't supported unless it's with a gradient.
😱👏👏👏
What mode do you have the second blend node set to at the end when you are compositing?
Waiting for version 11.9 so we can have it a simpler way :)) Come on guys, you can make it!
I have a scene in Blender where I have perfectly aligned a 3D building to the real building, so it is overlaid onto it in the Blender render. But then, when I bring the same image-sequence background into EmberGen as a backplate (set to 60 fps), the model is no longer aligned with the video. I have my exported FBX camera's frame rate overridden to 60 fps as well. I can't seem to figure out a solution and am beginning to think it may be a setting in Blender itself.
Please help, I'm losing my mind!
If you were the one posting in our forums, we solved the issue by zeroing out the camera's X/Y Shift 💪
I still have a huge doubt: can't I export the file the way it is in EmberGen, the simulation with the gradient and all? Lighting we can recreate in Blender, yeah, OK, but is there any way? Or do VDBs have to be manually shaded? Also, what about GPU particles and their gradients? Like, what if I used a random-per-particle value for color and I want the same in Blender?
Hi Yoshi,
VDBs need to be manually shaded; the format itself cannot pass along shading information (other than color, if you have more advanced packing methods). But even storing the voxels' color ultimately does not determine the shading; that's up to the renderer. In this (odd) method presented in the video, you get the best of both worlds.
You can absolutely export it the way it is in EmberGen; you would just need to learn how to composite it. I.e., export your camera from Blender, do your lighting in EmberGen to match the scene, then layer the images into your scene, the way the movies do it.
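"Layer the images into your scene" here is a standard "over" composite. As a reference, a minimal per-channel sketch (assuming a premultiplied foreground and values in the 0..1 range; the function name is just for illustration):

```python
def alpha_over(fg, fg_alpha, bg):
    """Standard 'over' composite for one channel, premultiplied foreground:
    the background shows through wherever the foreground is transparent."""
    return fg + (1.0 - fg_alpha) * bg

# An opaque foreground pixel fully replaces the background:
print(alpha_over(0.9, 1.0, 0.3))  # 0.9
# A half-transparent foreground lets half the background through:
print(alpha_over(0.5, 0.5, 0.5))  # 0.75
```

This is the same operation Blender's Alpha Over compositor node performs, run per color channel.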
@@jangafx yeahh i understood thanks a lot
How do I export particle .abc files to Blender? This is my most important question! Thx 😭
Nice tutorial! Who is this?
iss me :)
@@notwillclarke thanks, willclarke
This frame stride / timestep thing is still confusing to me, and you didn't explain it properly.
What is confusing about Override FPS * Frame Stride = TimeStep?
I didn't understand the synthesis part in Blender. What is the node configuration? 😂
What is the synthesis part you're talking about? Do you mean compositing them together?
@@notwillclarke It's from 27:00. I don't know how to set up the composition node.
@@댕굴DangGuRu So you have the Blender render, the EmberGen alpha channel export, and the EmberGen "render all" export. If you pause at 27:28, I'm using 2 Mix nodes: the first is set to Mix and the second is set to Screen. It's kinda hard describing the connections, but you should be able to mimic what I have based on what's on the screen. Hope that helps!
Hello, I could use some help. Is it possible to use imported geometry only as an effector and not as an emitter? I want to manipulate a fire emitter with imported geometry, but the geometry itself should not burn; rather, it should affect the emitter with its shape. I hope this was clear; my English is ugly, sorry. Sadly, I can't find an option like that. In Blender, for example, when you start a simulation, you can use a geometry as an effector inside the domain box. I really need this option in EmberGen. Can someone help, please?
I think you want to use this imported mesh as a collider; that way it won't emit smoke or fire, but it will be able to interact with them. Does that help your case?
I guess we're not at the point of not needing to fake emission with a point light yet 😅
liquigen when
February 12 - Closed alpha for suite owners
But the smoke is not visible with this workflow. What about that?
Thanks for the tutorial, but the reason people have been using VDB shading instead of your x3 method for more than 15 years now is quite obvious. Your method is projecting a texture over a meshed VDB, so the effect is being rendered as a mesh. This of course loses all the advantages the volume has, like the gradual depth accumulation of the VDB volume. For example, if you have a hand poking inside the fire/smoke, with your method the hand is either 100% in front of the fire or 100% behind the meshed fire, while with proper VDB shading the hand can be inside the volume: the closer it is to the front of the volume along the camera rays, the more of it you will see, while the deeper it is in the volume, the less of it will be visible... This method might be useful as a mesh light, for the lighting pass only, but it is definitely not a good idea in most cases for rendering the actual fluid.
This was just an alternative method for situations where you don't need a hand poking through a volume for gradual depth; you don't have to follow our ideas. Thanks for the input, though!
Wrong coordinates, wrong frame... EmberGen is driving me crazy.
This timestep / frame stride thing is unfortunately completely befuddling. I know you are trying, but really, that's complete nonsense. Byzantine is the only way to describe this workaround. Yikes.
I thought the equation was simple enough, but point taken
I don’t mean to be a jerk here, but this seems like something that should have been solved with a slider during the alpha or beta tests. I know it'll be built into the next one, but it really is a shame. I followed this whole tutorial excitedly, and you completely and utterly lost me at that point. I have faith it'll be fixed eventually.
Can you do a car tyre burnout in EmberGen and export it to Blender?