Using the Pose Asset for Joint Rigged Faces

  • Published 30 Jan 2025

COMMENTS • 22

  • @bryan_txg 3 months ago +1

    God bless you for sharing this for free

  • @MarksmanStudios 2 months ago +1

    "Great tutorial! I'm currently trying to use Live Link Face for facial capture and Rokoko motion capture for body movements simultaneously on a single mesh character (Paragon model) in Unreal Engine 5.4. However, I'm struggling to combine both systems for live production.
    Could you explain how to properly set up both Live Link Face and Rokoko mocap to drive the same character in real-time? Any guidance on integrating them within a single mesh would be incredibly helpful. Thanks for the awesome content!"

  • @flufflepimp 1 year ago +3

    Brilliant work man! Most concise way of doing this I've ever seen

  • @LFA_GM 1 year ago

    Thank you for sharing. I remember this workflow was requested during one of UE's livestreams about the Paragon characters, but it never came to fruition. I really appreciate you showing all the steps.

    • @TrashPraxis 1 year ago +1

      Thanks for your comment! This will work for any Paragon character with a rigged face, but of all of them I believe Boris will work the best. This is mainly because the eyes are meshed in on every character: they don't rotate independently of the face mesh, and on the human characters this looks a bit shit.

    • @LFA_GM 1 year ago

      @TrashPraxis That's true. I've tested it and noticed the eyes move stuck together with the eyelids. Such stretching doesn't look good. Thank you for the heads-up.

  • @graemepalmer2306 1 year ago +1

    Brilliant tutorial, thanks for making this!

  • @sahinerdem5496 7 months ago

    This is great; somehow I missed it last year. How can I use exported Audio2Face MetaHuman facial animations with this custom Pose Asset-driven method? All my characters use facial bones, not blendshapes...

  • @original9vp 1 year ago +1

    Good to see you @TrashPraxis

  • @hardcorerick8514 1 year ago

    Awesome!

  • @flufflepimp 1 year ago

    Do you happen to have that list handy?

    • @TrashPraxis 1 year ago +2

      // Left eye blend shapes
      EyeBlinkLeft,
      EyeLookDownLeft,
      EyeLookInLeft,
      EyeLookOutLeft,
      EyeLookUpLeft,
      EyeSquintLeft,
      EyeWideLeft,
      // Right eye blend shapes
      EyeBlinkRight,
      EyeLookDownRight,
      EyeLookInRight,
      EyeLookOutRight,
      EyeLookUpRight,
      EyeSquintRight,
      EyeWideRight,
      // Jaw blend shapes
      JawForward,
      JawLeft,
      JawRight,
      JawOpen,
      // Mouth blend shapes
      MouthClose,
      MouthFunnel,
      MouthPucker,
      MouthLeft,
      MouthRight,
      MouthSmileLeft,
      MouthSmileRight,
      MouthFrownLeft,
      MouthFrownRight,
      MouthDimpleLeft,
      MouthDimpleRight,
      MouthStretchLeft,
      MouthStretchRight,
      MouthRollLower,
      MouthRollUpper,
      MouthShrugLower,
      MouthShrugUpper,
      MouthPressLeft,
      MouthPressRight,
      MouthLowerDownLeft,
      MouthLowerDownRight,
      MouthUpperUpLeft,
      MouthUpperUpRight,
      // Brow blend shapes
      BrowDownLeft,
      BrowDownRight,
      BrowInnerUp,
      BrowOuterUpLeft,
      BrowOuterUpRight,
      // Cheek blend shapes
      CheekPuff,
      CheekSquintLeft,
      CheekSquintRight,
      // Nose blend shapes
      NoseSneerLeft,
      NoseSneerRight,
      TongueOut,
      // Treat the head rotation as curves for LiveLink support
      HeadYaw,
      HeadPitch,
      HeadRoll,
      // Treat eye rotation as curves for LiveLink support
      LeftEyeYaw,
      LeftEyePitch,
      LeftEyeRoll,
      RightEyeYaw,
      RightEyePitch,
      RightEyeRoll,

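      If you're renaming the poses in your Pose Asset to match this list, a quick sanity check helps, since a pose whose name doesn't exactly match an incoming Live Link curve simply won't be driven. Below is a minimal standalone Python sketch, an illustration rather than something from the video; ARKIT_CURVES is truncated here and check_pose_names is a hypothetical helper:

      # Compare Pose Asset pose names against the ARKit curve list above.
      # ARKIT_CURVES is truncated for brevity; fill in the remaining names
      # from the full list in the comment above.
      ARKIT_CURVES = [
          "EyeBlinkLeft", "EyeLookDownLeft", "EyeLookInLeft", "EyeLookOutLeft",
          "EyeLookUpLeft", "EyeSquintLeft", "EyeWideLeft",
          "JawForward", "JawLeft", "JawRight", "JawOpen",
          # ... mouth, brow, cheek, nose, tongue, head, and eye-rotation names
      ]

      def check_pose_names(pose_names):
          """Print curves with no pose and poses with no matching curve."""
          expected, actual = set(ARKIT_CURVES), set(pose_names)
          for name in sorted(expected - actual):
              print("no pose for curve:", name)
          for name in sorted(actual - expected):
              print("pose with no matching curve:", name)

      # Example: the typo in "EyeBlinkRigth" is reported as an unmatched pose.
      check_pose_names(["EyeBlinkLeft", "EyeBlinkRigth"])
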
  • @wsterlingy 1 year ago

    It's Xmas in May. You posted a new video. I'd love to know your methodology for getting a 1:1 relation between an actor and an avatar, assuming you've figured it out. BTW, the new Manus Meta gloves integrate beautifully with the latest version of Motive.

    • @TrashPraxis 1 year ago

      Hi Sterling, yeah, I have been away from it but I'm back now! I don't fully understand what you mean about the 1:1 relation. Are you referring to something about retargeting? Also, yes, Motive can now integrate the StretchSense gloves, which is so much better that way. I might make a short video about that soon too. We don't have the Manus Meta gloves, only the old ones. Do you have them? Are they good?

    • @wsterlingy 1 year ago

      @TrashPraxis I bought two pairs of the new Manus Quantum Meta gloves. They had a very clunky pipeline until Motive 3.0.3 was released. With it, the Manus data goes straight into Motive and gets baked in with the body data, which is very handy. 😀 This makes the AnimBPs less complicated because everything comes in through one Live Link connection. The finger tracking is the best I have seen so far. It's not perfect, and you aren't going to play guitar with them. We have more testing to complete. I'll keep you posted.
      I'm curious about your new StretchSense gloves.
      Lastly, my original question was regarding a 1:1 match between an actor and an avatar, so that the avatar doesn't suffer from sliding feet and other such side effects. I seem to recall you mentioning this in an old video and blaming OptiTrack. Happy Mocapping!

    • @TrashPraxis 1 year ago

      @wsterlingy Oh nice! I have wondered how those look. We don't have any new StretchSense gloves, only the SuperSplay, which is the previous iteration. The new ones are very similar but with more flex sensors. They can also integrate with Motive now, and it's so much better, as you noted.
      Regarding the sliding feet: the issue is that the Live Link skeleton from the OptiTrack plugin does not scale the bones, so unless your performer is precisely the same size as the skeleton you put the Live Link onto, you will see foot sliding. I think this is 100% an oversight from OptiTrack. Their old way, pre-Live Link, did not suffer this problem; it scaled the bones as expected. I wish they would fix it, because it's a lot of work to get around. I am currently exporting a short piece of motion from Motive for my performer with the 'stick skeleton' option, which you can bring into UE. This was a nice recent addition to Motive, but unfortunately they fucked this up too and you can't stream onto it because it's got no root. So I have been bringing it into Maya, rearranging it, saving a new FBX, and then streaming onto that.

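      For anyone reproducing that workaround, the Maya "rearranging" step might look something like the minimal maya.cmds sketch below. The joint name "Hips" and both file paths are assumptions about your skeleton and folders, not details from the thread:

      # Hypothetical sketch: re-root a Motive 'stick skeleton' FBX so it has a
      # root joint that can be streamed onto. Names and paths are assumptions.
      import maya.cmds as cmds

      SRC = "C:/mocap/stick_skeleton.fbx"          # take exported from Motive
      DST = "C:/mocap/stick_skeleton_rooted.fbx"   # re-rooted copy for UE

      cmds.loadPlugin("fbxmaya", quiet=True)       # ensure FBX import/export works
      cmds.file(SRC, i=True, type="FBX")           # i=True is the import flag
      cmds.select(clear=True)                      # so the new joint has no parent
      root = cmds.joint(name="root", position=(0, 0, 0))  # new root at the origin
      cmds.parent("Hips", root)                    # assumes the pelvis joint is "Hips"
      cmds.select(root, hierarchy=True)
      cmds.file(DST, exportSelected=True, type="FBX export", force=True)

      Adjust "Hips" to whatever the pelvis joint in your Motive export is actually called before running it.
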
  • @Mind_ConTroll 7 months ago

    I am poor and can't afford Maya. Can I get that FBX? I have that same bear, and I wanna make one video with him talking.

    • @TrashPraxis 6 months ago

      I don't think I could find that file. You can do it in Unreal with the Sequencer and an FK control rig.

  • @tradinglikeanoob 1 year ago

    Sir, please make a video on how to import the Rokoko AI rig into Unreal and apply the animation to a MetaHuman in UE 5.2.

    • @TrashPraxis 1 year ago

      I don't have the Rokoko AI rig, sorry.