MediaPipe4U: Metahuman motion capture and face capture

  • Published 9 Jun 2024
  • MediaPipe4U is an Unreal Engine plugin that integrates Google MediaPipe technology for motion capture and for puppeteering a 3D avatar through a webcam, videos, and images in real time
    Project page:
    github.com/endink/Mediapipe4u...
    Note:
    It only supports UE 5.0.x; do not use it with UE 5.1 or UE 4.x.
    There is a license file in the plugin (for more information, please read the docs) which will expire at the end of December, but I will upload a new license file to the GitHub pages before it expires. This means the plugin is free for personal users, but it is difficult to use for commercial projects.
    PS: This is a long, very detailed tutorial that starts from creating a project, to help you get started quickly with using MediaPipe4U to drive a MetaHuman. It covers motion capture, finger capture, and expression capture (no Apple device required, no NVIDIA GPU needed for native face capture, compatible with the 51 ARKit blendshapes).
    00:00 Project creation and plugin installation (see the plugin-check sketch after this chapter list)
    07:32 Metahuman Import
    11:00 Motion Capture Configuration
    23:00 Motion Capture Problem Handling
    26:14 Ground IK Configuration
    30:54 Face Capture Configuration
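
A minimal sanity check for the plugin-installation chapter, sketched under assumptions: the module name "MediaPipe4U" is taken from the plugin's title and may differ from the real module name(s) declared in its .uplugin file.

```cpp
// Sketch: verify at runtime that the (assumed) MediaPipe4U plugin module is loaded.
// NOTE: "MediaPipe4U" is a hypothetical module name - check the plugin's .uplugin file.
#include "CoreMinimal.h"
#include "Modules/ModuleManager.h"

void CheckMediaPipe4UModule()
{
    const FName ModuleName(TEXT("MediaPipe4U")); // assumed module name
    if (FModuleManager::Get().IsModuleLoaded(ModuleName))
    {
        UE_LOG(LogTemp, Log, TEXT("MediaPipe4U module is loaded."));
    }
    else
    {
        UE_LOG(LogTemp, Warning, TEXT("MediaPipe4U module is not loaded - check the plugin installation and that it is enabled for the project."));
    }
}
```

Calling something like this from, e.g., a GameInstance's Init can catch a missing or disabled plugin before any capture setup is attempted.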

COMMENTS • 48

  • @Bionicleman619 · 1 year ago

    Hello, I got the demo working and did all the steps with my MetaHuman up to 23:00, when they start moving. Are you using a webcam? I started with a blank scene; are we supposed to use the demo scene with its HUD? I also tried to add the other add-ons, but I couldn't find the MediaPipe live link... Could I send you my project so you can see what I'm missing? The demo is fantastic!

    • @andersxiao3632 · 1 year ago +1

      Please read the docs or download the demo project; in the demo project there is a face link map you can look at.

    • @Bionicleman619 · 1 year ago

      @andersxiao3632 I would pay someone a bit to set it up for me. Thanks for the help.

  • @aurelianobuendia24 · 4 months ago

    Could this work for a kind of "real-time lip sync"? It doesn't need to be good, just something like reading an audio file and transforming it into lip-sync animations.

  • @offworldlive · 4 months ago

    Got it working super smoothly without having to create a MetaHuman BP from scratch! What's the best way to contact you? We'd love to share this on our Discord streams :)

    • @andersxiao3632 · 4 months ago

      The only thing is that you need to renew the license manually; there are no restrictions on sharing or on any use purposes.
      github.com/endink/Mediapipe4u-plugin

    • @offworldlive · 4 months ago

      Do you have an email so we can reach you directly? Thanks :) @andersxiao3632

    • @offworldlive · 4 months ago

      If possible, we'd love to have your email to contact you directly. We are interested in knowing more about the plugin's source code and its pricing! Thanks :) @andersxiao3632

  • @waym0 · 11 months ago

    What can I do if the green lines don't disappear from my character at 30:12?

  • @Saechi · 1 year ago

    Is it possible to combine this BP with the third-person BP, or will there be a conflict?

    • @andersxiao3632 · 1 year ago

      It's a plugin for development; you need to develop the program yourself.

    • @Saechi · 1 year ago

      @andersxiao3632 Okay, thank you. Second thing: I notice that when it tracks me, the left arm just automatically goes up, like:
      \ o
      |\
      / \
      I've checked the distortion page, and I'm looking at the test source image to see what it's tracking, but it shows it's tracking my arm correctly.
      Is it a bone mapping issue, or a weight problem?

    • @andersxiao3632 · 1 year ago

      @Saechi Can you post an issue on GitHub? There you can upload images, videos, and a better description of the problem, and I will track the issue until it is resolved.

    • @Saechi · 1 year ago

      @andersxiao3632 Okay!

  • @Saechi · 1 year ago

    Is it possible to turn off facial capture and use iPhone Live Link alongside it? So MediaPipe4U for the body + iPhone Live Link for the face?

    • @andersxiao3632 · 1 year ago

      This is componentized. I'm not sure what your problem is: if you don't want to use the face component, of course you can leave it out, or use a different Live Link subject. Am I understanding correctly?

    • @Saechi · 1 year ago

      @andersxiao3632 Yes, I want to use a different Live Link subject for the face.

    • @andersxiao3632 · 1 year ago

      @Saechi MediaPipe face capture will not work if you don't put the face component into the level. Did you watch the tutorial?

    • @andersxiao3632 · 1 year ago

      Or set your subject to your iPhone subject name (a minimal Live Link subject-listing sketch follows this thread).

    • @Saechi · 1 year ago

      @andersxiao3632 Yes, I watched it. I was just wondering if I can have two mocap programs running at the same time; I know some programs conflict with each other.
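
To follow up on the body-from-MediaPipe plus face-from-iPhone question in this thread: one way to confirm that both Live Link subjects (the plugin's body subject and the ARKit face subject from the Live Link Face app) are visible to the engine is to enumerate them through Unreal's standard Live Link client interface. This is only a sketch built on engine APIs; the actual subject names depend on your sources and are not defined by this snippet.

```cpp
// Sketch: list every Live Link subject currently registered in the engine, e.g. to
// confirm that a MediaPipe4U body subject and an iPhone (Live Link Face) ARKit subject
// are both present. Requires the "LiveLinkInterface" module as a build dependency.
#include "CoreMinimal.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"

void LogLiveLinkSubjects()
{
    IModularFeatures& ModularFeatures = IModularFeatures::Get();
    if (!ModularFeatures.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        UE_LOG(LogTemp, Warning, TEXT("Live Link client is not available."));
        return;
    }

    ILiveLinkClient& Client =
        ModularFeatures.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Include disabled and virtual subjects so nothing is hidden from the listing.
    const TArray<FLiveLinkSubjectKey> Subjects =
        Client.GetSubjects(/*bIncludeDisabledSubject=*/true, /*bIncludeVirtualSubjects=*/true);

    for (const FLiveLinkSubjectKey& Key : Subjects)
    {
        UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"), *Key.SubjectName.Name.ToString());
    }
}
```

If both subjects show up, one common way to split them is to point the MetaHuman face Anim Blueprint's Live Link Pose node at the iPhone subject while the body keeps using the MediaPipe subject.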

  • @user-dc7bd5ws8r · 1 year ago

    Will face capture work on Android? The demo plugin only has a Windows build, so I can't try building the project for Android.

    • @andersxiao3632 · 1 year ago

      I will release an Android app that solves face blendshapes on Android and sends them to Unreal Engine.

    • @user-dc7bd5ws8r · 1 year ago

      @andersxiao3632 I want to use it in my build, without an additional app. Can I?

    • @andersxiao3632 · 1 year ago

      No, you can't do that, because it's not an Unreal Engine app.

  • @bccbnxc · 4 months ago

    Could MediaPipe4U's motion capture be exported to an animation clip in Unreal?

    • @andersxiao3632 · 4 months ago

      Sorry, only BVH-format data can be exported.

  • @wuttichonaukkhosuwan1389 · 1 year ago

    Can it be used with two cameras?

    • @andersxiao3632 · 1 year ago

      Theoretically, two-camera triangulation could be used to make depth more accurate, but this makes calibration quite difficult for the user, and I don't plan to take that step yet.

  • @RECFOX · 11 months ago

    After updating Windows, the plugin stopped seeing video and photo files at all.

    • @andersxiao3632 · 11 months ago

      Please post an issue on GitHub; we need more information...

    • @RECFOX · 11 months ago

      @andersxiao3632 After updating Windows, the old version of the plugin stopped working, but the new version with the new GStreamer works fine.

    • @RECFOX · 11 months ago

      @andersxiao3632 But now I have a new question: how can I increase the FPS when playing back a video file in the MediaPipe toolkit?

    • @andersxiao3632 · 11 months ago

      @RECFOX I think your license has expired. The free license has to be downloaded every 30 days; it is included in the latest plugin package. Please read the docs about the license.

    • @RECFOX · 11 months ago

      @andersxiao3632 Oh!! My bad, I didn't think about that. There is also a problem with NvAR: when starting the simulation and loading a video into the MediaPipe toolkit, UE just breaks and closes with an error. Video card: RTX 3060 12 GB.

  • @GioRob · 1 year ago +1

    So this all works in real time now?

    • @andersxiao3632 · 1 year ago

      Of course, you can see it from the video.

    • @GioRob · 1 year ago

      @andersxiao3632 It would be very helpful to see a video where you record your face in real time with the avatar moving!

    • @andersxiao3632 · 1 year ago

      @GioRob Aha, sorry, but I really don't want to show my face.

    • @ahmadzainudin366 · 1 year ago

      @andersxiao3632 How can we show the input video from the camera?

    • @andersxiao3632 · 1 year ago

      @ahmadzainudin366 Sure, please read the doc:
      opensource-labijie-com.translate.goog/Mediapipe4u-plugin/quick_start/texture_display.html?_x_tr_sl=zh-CN&_x_tr_tl=en&_x_tr_hl=zh-CN&_x_tr_pto=wapp
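
The linked page covers how MediaPipe4U exposes the incoming camera/video image as a texture. Purely as an illustrative sketch (how the UTexture2D is obtained is plugin-specific and described in that doc; the function and parameter names below are hypothetical), such a texture can be shown on a UMG Image widget like this:

```cpp
// Sketch: display a camera/video texture on a UMG Image widget.
// Assumes a UTexture2D* has already been obtained from the plugin as described in its
// texture_display documentation; this snippet only handles showing it.
#include "Components/Image.h"
#include "Engine/Texture2D.h"

void ShowSourceTexture(UImage* PreviewImage, UTexture2D* SourceTexture)
{
    if (PreviewImage && SourceTexture)
    {
        // bMatchSize = true sizes the brush to the texture, keeping the preview's aspect ratio.
        PreviewImage->SetBrushFromTexture(SourceTexture, /*bMatchSize=*/true);
    }
}
```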