BlendArMocap is getting wild - custom transfer, freemocap and more

  • Published 11 Jul 2024
  • Get BlendArMocap
    github.com/cgtinker/BlendArMo...
    Read the Docs:
    cgtinker.github.io/BlendArMocap/
    Get Freemocap
    freemocap.org
    00:00 Basics and Detection Features
    05:20 Freemocap Import
    06:30 New Transfer!
    08:30 Transfer Concept
    11:40 Value Mapping
    14:47 IK Chain
    18:03 Value by Distance
  • Film & Animation

COMMENTS • 122

  • @SquaresToOvals
    @SquaresToOvals 1 year ago +6

    This looks incredible. I remember spending a lot of time with a Kinect in MikuMikuCapture years ago, and I've been surprised to see no good models releasing with face, body, and hand tracking all-in-one. I hadn't known about MediaPipe Holistic until now.
    The best of luck with this project! I think I will learn Blender now :)

  • @HeelHeatTV
    @HeelHeatTV 1 year ago +3

    Crazy, I was messing with this last night and thought it was amazing! Thank you!!!

  • @obeycelestia
    @obeycelestia 1 year ago +3

    Man this is SO exciting, especially shaking hands with the FreeMoCap project is really cool to see! This is PRIMO

  • @ssanyeux
    @ssanyeux 1 year ago +3

    Dude, I love you. Thanks for creating an alternative flexible enough for DIYers (especially those who lack all the best equipment) and professionals.

  • @soothingillustration50
    @soothingillustration50 8 months ago

    This is really impressive! You deserve a lot of support and donations 👏

  • @marcosaltamirano8776
    @marcosaltamirano8776 1 year ago +1

    Your work is amazing; the pose and hands modes are the best ones.

  • @4U4U7
    @4U4U7 8 months ago

    You are a legend! Appreciate your work, thanks ever so much. Keep well and keep up the excellent work.

  • @s2mann
    @s2mann 5 months ago

    I use this with Blender 3.6. I haven't used mocap before and this made it incredibly easy. The tutorial is a must but after that, pretty smooth sailing. Thanks!!

  • @hbarone
    @hbarone 1 year ago

    Incredible job!! Looking forward to testing it.

  • @RitmuKids
    @RitmuKids 1 year ago +2

    Worked great, installed everything. Here are some suggestions: a preview of the video detection progress, and the option to choose at which frame the video detection starts and ends. Congratulations on the excellent work!

  • @Leukick
    @Leukick 1 year ago +1

    Subscribed! Looking forward to updates like Auto-Rig Pro compatibility.

  • @jamesmcandrew7860
    @jamesmcandrew7860 1 year ago +1

    I believe it's an AirTrack. BUT Holy crap! This is awesome!!! Great work!

  • @gaudenzin
    @gaudenzin 1 year ago +1

    Awesome work!

  • @kttdestroyer
    @kttdestroyer 1 year ago

    Just found this a few days ago... Just wanted to say thank you 🙏

  • @SimpHarderPlz
    @SimpHarderPlz 1 year ago

    This is a magnificent update
    Thanks

  • @Ready-Assets
    @Ready-Assets 1 year ago +1

    Thanks for the update! Super.

  • @themineurs
    @themineurs 10 months ago

    You are creating an open-source full motion tracking solution - it's insane. Thank you so much!

  • @NandukDaProot
    @NandukDaProot 1 year ago

    I don't have any more words than just beautiful, amazing, and amazing.

  • @Bordres
    @Bordres 1 year ago +1

    Amazing! I was wondering when you could use videos for mocap, as my webcam isn't the best in quality haha
    Looking forward to more updates on BlendArMocap!!!!

    • @cgtinker
      @cgtinker  1 year ago +2

      Videos have been usable for a while now; I made it the default as it had been overlooked by many people :0

  • @apachon
    @apachon 10 months ago

    This is amazing!

  • @realjames1
    @realjames1 1 year ago

    This looks amazing, since I'm planning to do an animation project.

  • @jihamih1219
    @jihamih1219 1 year ago

    Nice, thank you!

  • @TheRealJerseyJoe
    @TheRealJerseyJoe 1 year ago

    Awesome!

  • @gutzimmumdo4910
    @gutzimmumdo4910 1 year ago +1

    Amazing!

  • @patricki2071
    @patricki2071 10 months ago

    You need more followers - this is amazing, and with the GNU license it's a very helpful mocap add-on. Thank you!

  • @vivianwarran8804
    @vivianwarran8804 1 year ago

    Thank you for your video. How do you get the metarig into a model? Thank you.

  • @mb345
    @mb345 1 year ago

    Can you place your video in another part of the UI so we are able to see what options you select on the right side of the screen? Awesome tool you built!

  • @Vidyut_Gore
    @Vidyut_Gore 1 year ago

    This looks amazing. I read the docs. I didn't understand why Blender needs to run as admin.

  • @isaacwardmusic
    @isaacwardmusic 7 months ago +1

    What should be done if your mesh overlaps with itself after applying mocap data?

  • @DanielPartzsch
    @DanielPartzsch 1 year ago

    Thanks. Is it not possible to also use Rigify rigs with the new face rig, even if I'd just like everything but the face mocap? Also, does transfer mean it gets linked via drivers, or does it bake the movements to the Rigify rig? Thanks again!

    • @cgtinker
      @cgtinker  1 year ago +1

      For the new face you'd have to create a new config file - I think renaming targets should do. Linking is via drivers and constraints; this is intended. It allows you to modify the data more easily using the driver settings in the constraint panel.

  • @danyalghani7421
    @danyalghani7421 6 months ago

    Amazing stuff, truly! And the option to import from freemocap translates into multicam goodness for us DIYers! Question: have you considered building your own multicam setup for BlendArMocap within Blender?

    • @cgtinker
      @cgtinker  5 months ago +1

      I don't plan to develop it further for now.

    • @danyalghani7421
      @danyalghani7421 5 months ago

      @@cgtinker Oh, that's sad, but understandable. Keep up the good work!

    • @cgtinker
      @cgtinker  5 months ago

      @@danyalghani7421 thanks :) hope you're doing well!

  • @mbxymail
    @mbxymail 1 year ago +1

    I have a suggestion: it seems the nonlinear animation tools cannot be used to make adjustments, right? Could you create a button that takes the final rig, creates keyframes for it, releases/deletes the empties, and deletes the constraints, so it just has keyframes? That way Blender's nonlinear animation tools could be used. That would advance mocap. THANKS

  • @liweilin6649
    @liweilin6649 1 year ago

    When I'm done transferring the drivers, how do I adjust the Rigify pose? I can't keyframe or move the Rigify rig.

  • @isaacwardmusic
    @isaacwardmusic 7 months ago +1

    Is it possible to remove the mocap data after transferring it?

  • @MrGamelover23
    @MrGamelover23 1 year ago +1

    The fact that you were able to make real-time motion capture work in Blender is amazing! Any chance you could take this tech and turn it into something that works independently of Blender, so that it could be used, for instance, with other animation software, with video game engines, or as a tracker for vtubing software? Because if you could make this work in a game engine, it could change the game for amateur v-tubing.

    • @cgtinker
      @cgtinker  1 year ago +1

      Should work, but I guess I don't have the time to maintain stuff like that (kinda capped with my current tools..). Which game engines were you thinking of? Just curious.

  • @showhuiy933
    @showhuiy933 1 year ago

    Android's BlendArTrack cannot compress files for sending to a computer. How do I set up the application?

  • @wbcast9698
    @wbcast9698 1 year ago

    Can you do a demo of the face with video and sound in Blender? I loaded BlendArMocap and an audio track, but it seems the transfer of the mouth movement while speaking is not working. Please? Thanks!

  • @dpersona
    @dpersona 8 months ago

    How do I combine retargets of BlendArTrack facial data and a Mixamo animation?

  • @cartoonforkids-usa725
    @cartoonforkids-usa725 6 months ago

    My webcam doesn't work in Blender 4.0.2 on macOS 14.2.1 - what should I do? I installed everything needed.

  • @blented
    @blented 1 month ago +1

    The app is gone from the iPhone App Store :( What happened?

  • @zoruken_4w4
    @zoruken_4w4 1 year ago

    Thanks for this amazing update!!!!
    Oh, also, will BlendArTrack get any updates?
    The app is capping the framerate at 15 fps, unfortunately...
    I also wasn't able to set up a 1920×1080 resolution, but that isn't a big issue compared to the framerate
    (even with lower settings).
    It also has some small tracking displacements when one uses different angles, points the camera up into the sky, or turns around.
    An external stabilizer gimbal fixes a lot of the camera shake, luckily ☆

    • @cgtinker
      @cgtinker  1 year ago +2

      planning to switch the video recording subsystem in the future.. will take time - but let's call that a yes^^

  • @tobiasb.7516
    @tobiasb.7516 1 year ago

    Hi there, is a full tutorial planned? Like for a stickman or whatever - I have problems transferring it to a character -_- and I've already tried to understand your videos.

  • @hata_0
    @hata_0 4 months ago

    After I retargeted the animation to Rigify in Blender, it did not capture the center-of-mass information. How can I solve this problem?

  • @PecoraSpec
    @PecoraSpec 8 months ago

    I noticed that all my recordings create uncanny results where the face is badly shaky! What should I do?

  • @ArpittheFact
    @ArpittheFact 11 months ago

    Hey, can we put the animation on another rig as well? When I start the animation transfer it stops the other armature's animation. Please reply 🙏🙏🙏🙏🙏

    • @cgtinker
      @cgtinker  11 months ago

      I don't fully understand, sadly. Do you want to "retarget"? Also, due to the nature of the add-on, driver objects get deleted when a new config gets applied; the add-on is not intended to animate multiple rigs in one scene.
      ua-cam.com/video/ZfLU2NXWsUE/v-deo.html

    • @ArpittheFact
      @ArpittheFact 11 months ago

      Thank you so much for your reply.
      Can I export an animated armature and import it again in Blender, so that I can use the animation on a different rig in Blender?

    • @cgtinker
      @cgtinker  11 months ago

      @@ArpittheFact you can use this rigify extension I made to make the export easier:
      github.com/cgtinker/rigify_gamerig_extension
      Link the metarig to the control rig, then bake some animation(s), unlink it, and export it as FBX, for example, once you are done.
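
      As a rough illustration of that last step - a minimal sketch using standard Blender operators, assuming a baked, unlinked metarig; the object name "metarig" and the file path are placeholders, not from the add-on:

```python
# Hypothetical export step for a baked metarig ("metarig" is a
# placeholder object name; adjust the path to taste).
import bpy

rig = bpy.data.objects["metarig"]
bpy.ops.object.select_all(action='DESELECT')
rig.select_set(True)
bpy.context.view_layer.objects.active = rig

bpy.ops.export_scene.fbx(
    filepath="//mocap_animation.fbx",  # "//" is relative to the .blend file
    use_selection=True,                # export only the selected armature
    add_leaf_bones=False,              # cleaner skeleton for game engines
    bake_anim=True,                    # include the baked action(s)
)
```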

  • @juvenilenayem
    @juvenilenayem 9 months ago

    Does this system work with an iClone character?

  • @Hepworks
    @Hepworks 1 year ago

    Hi - is there some way to see the frame rate of the motion capture data? I could see in the BlendAR app that it seemed to vary. I'm trying to figure out how best to sync it to audio.

    • @cgtinker
      @cgtinker  1 year ago +1

      For BlendArMocap with video files that should basically work out of the box. I've also synced audio with BlendArTrack (iOS) successfully - using a clapper helps a lot... As long as the frame rate is consistent there shouldn't be an issue.

    • @Hepworks
      @Hepworks 1 year ago

      @@cgtinker Does BlendArTrack allow you to record the audio? I ended up making weird faces to use as my clapper!... and was recording my audio into OBS - my problem was partially, for sure, having different frame rates between Blender and Premiere... and OBS, for that matter... I just need to smooth out my workflow. Follow-up question: is the data from BlendArMocap as high resolution as from BlendArTrack? What a great set of tools you've created!

    • @cgtinker
      @cgtinker  1 year ago

      @@Hepworks you're welcome :)
      Oddly enough, I'd say BlendArMocap is somewhere in between: ARCore (Android) < MediaPipe (desktop) < ARKit (iOS).
      I think I'll soon switch to shape keys; I think that will improve transfer to faces a lot, as point-based isn't too great. I'm considering making most of the tracking stuff standalone in a separate exe, as it's easier to maintain for me.

  • @geraldbryan7249
    @geraldbryan7249 1 year ago

    Hi, there is a problem when I transfer the animation: the arms of my rig don't move, but everything else does. Do you have any idea why, and how do I fix this? Thanks!

    • @cgtinker
      @cgtinker  1 year ago

      Most likely the Rigify rig has not been generated.

  • @marcosaltamirano8776
    @marcosaltamirano8776 1 year ago +1

    It's great, but I have a little problem: the face has a few issues. The markers show precise movement of the face, but the face rig doesn't move the same way.

    • @cgtinker
      @cgtinker  1 year ago +1

      Check out the distance driver setup; you can modify the results and maybe even enhance the transfer config.

  • @Wolv21
    @Wolv21 1 year ago +2

    Hi! I'm running Blender 3.4.1 and I have session data from freemocap. But when I click Load Session Data I get the error message "Session directory not valid. D:\videos\mocap session\session_2023-03-11-10_29_10\"
    Wonderful work you are doing here (and Jon). If I can redirect more information to you let me know!

    • @cgtinker
      @cgtinker  1 year ago

      Thanks, uhm, does the folder contain the folders "DataArrays" and "SyncedVideos"?

    • @Wolv21
      @Wolv21 1 year ago

      @@cgtinker I see the folders "annotated_videos", "calibration_videos", "output_data" and "synchronized_videos"

    • @cgtinker
      @cgtinker  1 year ago

      Seems like folders have been renamed on the freemocap side? Are there .npy files in the output folder?

    • @Wolv21
      @Wolv21 1 year ago

      @@cgtinker yes, in "output_data" I see mediapipe_body_3d_xyz.npy, mediapipe_face_3d_xyz.npy, etc.,
      and in "synchronized_videos" Camera_003_binary.npy and Camera_004_binary.npy.

    • @Wolv21
      @Wolv21 1 year ago

      I'm also running the freemocap GUI to do the calibration / capture / sync / export to Blender, so I'm not sure if that has something to do with it.
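
      The folder names in this thread suggest that newer freemocap sessions no longer match the legacy layout the add-on checks for. A speculative workaround sketch, assuming the add-on still looks for "DataArrays" and "SyncedVideos" (untested; the path is the example from above):

```python
# Speculative workaround: alias the renamed freemocap folders back to the
# legacy names the add-on may expect. An untested assumption, not an official fix.
from pathlib import Path

session = Path(r"D:\videos\mocap session\session_2023-03-11-10_29_10")
aliases = {
    "DataArrays": "output_data",          # legacy name -> current name
    "SyncedVideos": "synchronized_videos",
}

for legacy, current in aliases.items():
    target = session / current
    link = session / legacy
    if target.is_dir() and not link.exists():
        # Directory symlink; on Windows this may require admin rights
        # or developer mode enabled.
        link.symlink_to(target, target_is_directory=True)
```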

  • @ok-hc4he
    @ok-hc4he 1 month ago

    How do you add an actual model to all this?

  • @dawitadmasu-vo6mu
    @dawitadmasu-vo6mu 1 year ago

    It is really cool!!
    Could you tell me how to fix the problem where the installation of mediapipe fails with "check the system console output" when I try to install the dependencies?

    • @cgtinker
      @cgtinker  1 year ago

      Create a GitHub bug report and give me some info there (it's hard to debug on YouTube). Let me know the following:
      your operating system and Blender version, and copy the logs from the system console and either attach them as .txt or paste them.
      github.com/cgtinker/BlendArMocap/issues/new/choose

    • @dawitadmasu-vo6mu
      @dawitadmasu-vo6mu 1 year ago

      Thank you, my brother, now it is solved!!!!

  • @hrishis
    @hrishis 1 year ago

    Just curious: what Linux distro is that?

  • @kingskye7110
    @kingskye7110 1 year ago

    Can you provide a separate video introducing the general algorithm from MediaPipe to bones? I'm not very familiar with Blender.

    • @cgtinker
      @cgtinker  1 year ago

      There isn't a general algorithm. I've started working on a Python package, but it's not done yet:
      github.com/cgtinker/mediapipe_rotations
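
      To give a flavor of the underlying idea - a sketch only, not the mediapipe_rotations API - one common approach is to turn two tracked landmarks into a bone rotation via the quaternion that maps the bone's rest direction onto the tracked direction:

```python
# Sketch of the core idea (not the actual mediapipe_rotations API):
# derive a bone rotation from two MediaPipe landmarks by computing the
# quaternion that rotates a rest direction onto the tracked direction.
import numpy as np

def rotation_between(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Quaternion (w, x, y, z) rotating unit vector u onto unit vector v."""
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    w = 1.0 + float(np.dot(u, v))
    if w < 1e-8:                           # opposite vectors: 180° turn
        axis = np.cross(u, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:    # u was parallel to x; pick y instead
            axis = np.cross(u, [0.0, 1.0, 0.0])
        q = np.array([0.0, *axis])
    else:
        q = np.array([w, *np.cross(u, v)])
    return q / np.linalg.norm(q)

# Example: a bone's rest direction vs. a made-up shoulder->elbow landmark delta.
rest_dir = np.array([0.0, 1.0, 0.0])
tracked_dir = np.array([0.2, 0.9, -0.1])
print(rotation_between(rest_dir, tracked_dir))
```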

  • @mica.motion
    @mica.motion 1 year ago

    I don't understand how to edit the animation after the transfer is done. Help, please.

    • @cgtinker
      @cgtinker  1 year ago

      All the data inside the collections uses keyframes (besides the objects with a .D suffix, which are just drivers based on that data). You can modify driver values using the mask presented in the video; besides that, working directly with the keyframed data is good as well.
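
      For example, a minimal sketch for Blender's Python console that separates the keyframed empties from the .D driver objects - the "cgt_" collection prefix follows what is shown in the video and should be treated as an assumption:

```python
# Inspect the transferred data: driver objects end in ".D", everything
# else in the "cgt_" collections carries plain keyframes.
import bpy

for coll in bpy.data.collections:
    if not coll.name.startswith("cgt_"):
        continue
    for obj in coll.objects:
        kind = "driver" if obj.name.endswith(".D") else "keyframed"
        has_anim = obj.animation_data is not None
        print(f"{coll.name}/{obj.name}: {kind}, animation_data={has_anim}")
```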

  • @soitk1825
    @soitk1825 1 year ago

    Will BlendArMocap work with Mixamo rigs?

    • @cgtinker
      @cgtinker  1 year ago

      Technically yes - the tools to do so are there within the add-on - but I don't create custom configs.

  • @drugdog-qp8ww
    @drugdog-qp8ww 7 months ago

    How would I bake the motion from drivers to keyframes?

    • @cgtinker
      @cgtinker  7 months ago

      Baking works the same as usual; just bake with visual keying.
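
      In operator form, that amounts to something like the following sketch (the frame range and options are illustrative; run it in Pose Mode with the control bones selected):

```python
# Bake the driver/constraint-driven motion down to plain keyframes
# using visual keying, over the scene's frame range.
import bpy

bpy.ops.nla.bake(
    frame_start=bpy.context.scene.frame_start,
    frame_end=bpy.context.scene.frame_end,
    only_selected=True,
    visual_keying=True,        # record the final, constraint-evaluated pose
    clear_constraints=True,    # optional: drop the mocap constraints afterwards
    bake_types={'POSE'},
)
```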

  • @sarfarajsaiyyad4057
    @sarfarajsaiyyad4057 1 year ago

    It's great, cgtinker! Can you please make a video on how we can transfer the captured data onto another model? Or how we can export this data in FBX format for mocap animation?

    • @cgtinker
      @cgtinker  1 year ago +1

      Guess I'll make a video about exporting soon, as many seem to have issues doing so.
      This video should actually give you the knowledge on how to transfer to another model.

    • @shoaibakhtar4934
      @shoaibakhtar4934 1 year ago

      @@cgtinker So if we select the model's armature and apply the mocap to it, will it automatically work?

    • @cgtinker
      @cgtinker  1 year ago

      @@shoaibakhtar4934 the config is for Rigify; you can change the config to support other/similar rigs. So if the armature is a generated Rigify rig - yes.

  • @haganji
    @haganji 1 year ago

    How can I export an FBX file including the animation from the rigged file?
    I've tried, but every time I failed...

    • @cgtinker
      @cgtinker  1 year ago

      You've got to bake before exporting.
      Where do you want to export to?

    • @haganji
      @haganji 1 year ago

      @@cgtinker How can I explain where I tried to export to... I want to explain with pictures, because I cannot explain it exactly. Let me know your email address, please...

    • @cgtinker
      @cgtinker  1 year ago

      @@haganji hello@cgtinker.com

    • @haganji
      @haganji 1 year ago

      @@cgtinker I have sent you an email. Please check it.

    • @cgtinker
      @cgtinker  1 year ago

      @@haganji I will :) give me 2-3 days.

  • @manuelbarrios2732
    @manuelbarrios2732 1 year ago

    Is it possible to transfer the data to another armature, like Auto-Rig Pro or Mixamo, for example?

    • @cgtinker
      @cgtinker  1 year ago

      Should work, yes. For Auto-Rig it may be sufficient to just change targets without touching the drivers. (I don't have Auto-Rig, but it seems to use Rigify.)

    • @misha4sculpt
      @misha4sculpt 1 year ago

      @@cgtinker Hey... thanks for the update... I want to try to create a target rig for the MetaHuman rig... let me know if you have an outline to get started or any other suggestions... if I can follow an outline for the Rigify rig, let me know...

    • @cgtinker
      @cgtinker  1 year ago

      @@misha4sculpt the rigify config delivers a fairly nice outline to do so. Most of it should work by just switching targets.

    • @misha4sculpt
      @misha4sculpt 1 year ago

      @@cgtinker Where do I find that...

    • @misha4sculpt
      @misha4sculpt 1 year ago +1

      ...never mind... found it

  • @Kozlov_Production
    @Kozlov_Production 1 year ago +1

    Sorry, but I didn't find a way to transfer face animation to HumanGenerator models

    • @cgtinker
      @cgtinker  1 year ago

      The video contains everything you need to know to create a custom configuration to do so. It's some work, though. However, once a config has been generated it can be shared, so hopefully the transfer config pool grows over time.

    • @Sgirlie755
      @Sgirlie755 1 year ago

      Have you figured out how to transfer face animation to Human Generator models? I'm stuck at that part of the transfer too.

    • @Kozlov_Production
      @Kozlov_Production 1 year ago

      @@Sgirlie755 no

  • @francez123456789
    @francez123456789 1 month ago +1

    Sad this isn't maintained anymore. If I knew how, I would offer to help, but I don't.

  • @kendarr
    @kendarr 1 year ago

    When I use it to make face animations, the animation is too subtle on the character - any idea why? It moves very little.

    • @cgtinker
      @cgtinker  1 year ago

      Probably you have to increase the influences. You can change the settings of the mapping objects and you should be fine.

    • @kendarr
      @kendarr 1 year ago

      @@cgtinker Will try that, thanks. I scaled the empties to a large scale and that 'fixed it', but the tracking doesn't really convey the lip motion super well (on my model at least).

    • @cgtinker
      @cgtinker  1 year ago

      @@kendarr sometimes it requires some cleanup / a change of settings - planning to improve the face transfer results in the future.
      If you go into the "cgt_face" collection you will find empties named something like "cgt_lid.R.L" - in the object data props (where the constraints live) you'll find custom settings which help to improve the mapping.
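
      To see at a glance which of those custom settings exist on your empties, a small console sketch - the "cgt_face" collection name comes from the reply above; the property set itself will vary:

```python
# List the face-mapping empties and their custom properties so you can
# see which mapping values are available to tweak.
import bpy

face_coll = bpy.data.collections.get("cgt_face")
if face_coll:
    for obj in face_coll.objects:
        props = {k: obj[k] for k in obj.keys() if k != "_RNA_UI"}
        if props:
            print(obj.name, props)
```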

  • @faizankhan6045
    @faizankhan6045 4 months ago

    I used 12 keyframes 😂😂 - it took 2 min for 30 sec.

  • @Gray-Today
    @Gray-Today 1 year ago

    I hope you get heat in your place.