EmberGen Tutorial: Importing Meshes and Cameras

  • Published 23 Nov 2024

COMMENTS • 30

  • @jakepogson8618 · 22 days ago +1

    Awesome tutorial!
    Quick question: at the start you scaled the imported meshes to fit EmberGen. Doesn't that mess up the proportions in your camera view when you export, or is that scaled relative too?

  • @KingKong19100 · 5 months ago +3

    Absolutely amazing video!!!! I have been asking for this EXACT workflow for like a year now and you guys answered all my questions! Great stuff. You have no idea how much this is gonna help people who can't afford to set up a render farm with massive VDBs. Next best thing? Frame matching and rendering straight out of EmberGen and comping back into our og shots. The holdouts and everything make perfect sense! This is gonna be such an amazing tool and this video is gonna help people TREMENDOUSLY!!!
    I do have a couple ideas for either tutorials or videos you guys could possibly make that don’t have to be too long but would definitely help everybody out imo.
    1. Importing Alembics:
    - I haven’t done this yet but I’m hoping this is something that’s possible? It could genuinely save so much more time and space when it comes to baking out our rigs since that’s just straight final mesh.
    2. Importing Lights:
    - I don't know if this is possible yet in EmberGen, but one thing I think would be tremendous is importing lights! I would love to match my Blender light settings with EmberGen's scene so that my comped results could look closer. This could involve setting a custom HDRI with the same coordinates, importing the sun lamp, and maybe even bringing in point and area lights. If there's some tool that can convert them to EmberGen's settings, that could be cool.
    3. Render Passes:
    - I would love to use the compositing workflow you've mentioned in this video. I think it would be amazing if we got a quick, detailed video showing how to render out our fire and smoke for complete control: how to render the emission passes and anything related to them, things like shadow/AO passes, volume direct and indirect passes, and any lighting passes, whatever gives us the best kinds of control in the compositing stage to help make our results more believable! Here's something to ponder: what if you could even render a normals pass? Or some kind of density or depth pass? That could help in the comp stage a lot! After you knocked it out of the park with this video, I think this idea might be the next best part!
    4. Underwater Scenes:
    - This one might sound weird, but Jason Key did an INCREDIBLE "underwater" scene where he simulated what looked like bubbles and cool effects giving the illusion of being underwater. Why not take this a step further? What if you made an underwater video just showing how to simulate massive-scale water simulations and how to shade the smoke and its physics to look like water? GPU particles could come in handy there as well. Then there's the underwater part that Jason did: how to simulate actually being underwater and have a character emit bubbles through masked sections of the character and whatnot.
    That’s all I got for now. You guys are killing it!!! I can’t wait to mess with this and GeoGen!!!

  • @ezr3alfrost · 5 months ago +1

    Excellent video! This should have come out earlier. I was struggling with a project a few weeks back and had to render out "unnaturally" for my CG plate.
    There's another issue I faced: EmberGen does not support cameras with a target. I had to create a separate camera, use constraints to link it to the original camera, bake its animation onto the new one, and import that to get the shot working. A weird approach tbh (see the sketch after this comment). 😢
    Another thing you could add is the possibility of linking the simulation domain to a mesh or something so it can move and follow the object.
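
    A minimal Blender Python sketch of that constrain-and-bake workaround (object names, the frame range, and the output path are placeholders, not from the video):

      import bpy

      # Assumes the original target-tracking camera is named "Camera".
      src = bpy.data.objects["Camera"]
      scene = bpy.context.scene

      # Create a plain camera and pin it to the original with a constraint.
      baked = bpy.data.objects.new("Camera_baked", bpy.data.cameras.new("Camera_baked"))
      bpy.context.collection.objects.link(baked)
      con = baked.constraints.new('COPY_TRANSFORMS')
      con.target = src

      # Bake the constrained motion down to plain keyframes, then drop the constraint.
      bpy.ops.object.select_all(action='DESELECT')
      baked.select_set(True)
      bpy.context.view_layer.objects.active = baked
      bpy.ops.nla.bake(frame_start=scene.frame_start, frame_end=scene.frame_end,
                       visual_keying=True, clear_constraints=True, bake_types={'OBJECT'})

      # Export just the baked camera as FBX for EmberGen's import node.
      bpy.ops.export_scene.fbx(filepath="//camera_baked.fbx", use_selection=True,
                               object_types={'CAMERA'}, bake_anim=True)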

  • @Aggredior · 5 months ago +1

    I was searching for this fps solution like a week ago and you guys made this video! Thank you.

    • @benharkervfx · 5 months ago

      you can even download a copy and set it as your desktop background sir!

  • @Neiyamax · 23 days ago

    Great tutorial, thanks 👍...

  • @marioCazares · 4 months ago +1

    Thank you, I was wondering how to export for 24fps animations :D

  • @MaksimZiabkin · 4 months ago

    Highly appreciated!

  • @Escelce · 5 months ago

    Great video. Thank you

  • @sameeruddin · 14 days ago

    Pardon my math, but what would be the ideal settings for 25fps?
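
    A rough sketch of the arithmetic behind these settings, assuming the exported frame rate works out to 1 / (time step × frame stride); the parameter semantics below are an assumption, not an official formula from the video:

      # Assumption: EmberGen advances the sim by `time_step` seconds per simulated
      # frame and the exporter writes every `frame_stride`-th frame, so the exported
      # clip plays back at 1 / (time_step * frame_stride) fps.
      def time_step_for(target_fps: float, frame_stride: int = 1) -> float:
          """Time step (seconds per sim frame) that yields target_fps after striding."""
          return 1.0 / (target_fps * frame_stride)

      print(time_step_for(25))     # 0.04       -> 25 fps with stride 1
      print(time_step_for(25, 2))  # 0.02       -> 25 fps with stride 2 (extra substeps)
      print(time_step_for(24))     # 0.0416...  -> 24 fps with stride 1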

  • @xpez9694 · 13 days ago

    I can't get EmberGen to see my camera from C4D in the FBX export. Any tips on what I need to check?

  • @Shakrahn_is_a_creative · 4 months ago

    Did you just render inside EmberGen and composite in After Effects, or did you export the VDB into Blender and render it all together?

    • @jangafx · 4 months ago

      In this case it's rendering in EmberGen and compositing in After Effects.

  • @MauZarts · 5 months ago

    Now I just want to know how I can export this back to Blender via VDB!

    • @jangafx · 5 months ago +1

      Add a VDB export node, and plug the import node's transform pin into the transform pin of the VDB node. Then follow one of the many import-VDB-into-Blender tutorials out there, or check out our Discord, where we have some pinned shaders in the Blender channel. That's it, really simple!
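
      For the Blender side, a minimal Python sketch of loading the exported VDB sequence and giving it a Principled Volume shader (paths, frame counts, and names are placeholders; the pinned Discord shaders mentioned above are the more complete setup):

        import bpy

        # Point `filepath` at the first file of the VDB sequence exported from EmberGen.
        vol_data = bpy.data.volumes.new("embergen_sim")
        vol_data.filepath = "//vdb/embergen_sim_0001.vdb"
        vol_data.is_sequence = True
        vol_data.frame_start = 1
        vol_data.frame_duration = 120  # number of cached frames

        vol_obj = bpy.data.objects.new("embergen_sim", vol_data)
        bpy.context.collection.objects.link(vol_obj)

        # Simple Principled Volume material (reads the "density" grid by default).
        mat = bpy.data.materials.new("embergen_volume")
        mat.use_nodes = True
        nodes, links = mat.node_tree.nodes, mat.node_tree.links
        shader = nodes.new("ShaderNodeVolumePrincipled")
        links.new(shader.outputs["Volume"], nodes["Material Output"].inputs["Volume"])
        vol_obj.data.materials.append(mat)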

  • @spellofstyle · 4 months ago

    My camera never matches the exact location and rotation as in Blender. I've tried exporting and changing settings a gazillion times, but nope.

    • @vaxs3993 · 18 days ago

      You can do the animation at 60 fps, then comp it in DaVinci, where you can play with time and scale.
      I did this on my last video.

  • @copperband43 · 5 months ago

    What do I do if I import a 120-frame video? How do I set the frame stride and time step figures?

    • @Shakrahn_is_a_creative · 4 months ago

      Just a quick tip, fam: if you ask any AI model, it's going to help tell you the actual figures you're looking for, be it ChatGPT or Gemini.

    • @Shakrahn_is_a_creative · 4 months ago

      It works for me

    • @notwillclarke · 1 month ago

      Do you mean 120 fps, or is the video itself 120 frames?

    • @copperband43 · 1 month ago

      @notwillclarke Yes, 120 frames.

  • @toxicpurplesubstance6353 · 5 months ago +3

    But you really shouldn't have to do this. If you're working on a 24 fps project, you should just be able to select that and everything should work out. And what's with the weird approach to scaling? Voxel sizes are displayed in meters, but that means regular fires are dozens of meters tall and the default explosions hundreds of meters. You set your scaling correctly in your 3D package, but then it's all wrong in EmberGen, and you have to reverse engineer what "scale and center to fit" does when you want to export your EmberGen VDBs back to your 3D app; it's just a tedious mess. EmberGen is fast and produces great results and the quick visual feedback is awesome, but the lack of a proper workflow with other 3D apps makes it a headache.

    • @jangafx · 5 months ago +5

      We are well aware of all this and agree.
      For scaling, ignore EmberGen scaling. If you plug the import node's transform pin into the VDB node's transform pin, we do ALL of the work for you.

  • @doctorkj5640 · 5 months ago

    EmberGen will not be taken seriously until it has some sort of sparse solver. It is real-time only for cheap-looking, game-like explosions, but for VFX production there is so much to be desired.
    Imported mesh handling is bad, unintuitive, and very finicky. The initial fascination with a real-time fluid solver fades quickly when you realize it would be much faster, and with much better results, to go the Phoenix FD route or even the very complicated Houdini.
    For realistic results EmberGen is no faster than Phoenix. It can be pretty slow, actually.

    • @jangafx · 5 months ago

      It's coming and in the works. Thanks for the feedback :)

    • @doctorkj5640 · 5 months ago

      @jangafx OK, great. While you're at it, just make the domain movable instead of having to adjust the imported geometry.
      So... when? This year? 2030? 2035?

    • @jangafx · 5 months ago

      @doctorkj5640 A movable domain is coming next month. As for EmberGen 2.0, we'd probably need to go through a time-travelling wormhole in space to know the date on that.

    • @doctorkj5640 · 5 months ago

      @jangafx OK. 2035 it is! Thanks for the answer.