blendartrack - ar face tracking to rigify rig transfer

  • Published Dec 24, 2024

COMMENTS • 230

  • @NanoDraws
    @NanoDraws 2 years ago +20

    Usually, I don't write any comments, but you are an incredible lifesaver (and mostly a timesaver). I'm just getting started with 3D modeling and animation, and as a beginner I really couldn't see myself animating faces by hand for hours (probably for a poor result, too). It's really easy to use and the result is impressive. So sincerely, thank you for sharing this!
    And for anyone wondering why the face rig does not move: that's okay, it's not an issue. The face rig is only used to generate the driver rig, and that one does move.
    (Pointing this out so nobody else gets stuck on it for nothing like I did lmao)

    • @cgtinker
      @cgtinker  2 years ago

      thanks a lot, glad it enables you to move further :)
      keep on creating!

  • @verticallucas
    @verticallucas 1 year ago +1

    I probably already left a comment at some point, but wanted to say this addon is phenomenal. I really hope you keep updating it, because it has such a streamlined workflow and is quite straightforward and effective.

  • @activemotionpictures
    @activemotionpictures 3 years ago +5

    This is impressive! So the empty cloud generates (snaps to) a close rigify facial bone. This is an amazing improvement! Thank you for sharing this!

    • @cgtinker
      @cgtinker  3 years ago

      glad you like it! here's what's happening:
      - The global rigify super.face and local bones get aligned with the empties (this doesn't take parent scale into account yet)
      - The generated rigify rig's drivers get constrained to the empties
      At first I planned to make any rig snap to the empties, but after some testing I realised that cannot work out. Face rigs usually rely on many drivers, which is why I went for the rigify face rig approach :)
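      A minimal bpy sketch of that kind of constraint setup (the object and bone names "driver_rig", "lip.T" and "ar_empty_lip.T" are placeholders for illustration, not the add-on's actual naming):

        import bpy

        driver_rig = bpy.data.objects["driver_rig"]   # placeholder driver armature
        empty = bpy.data.objects["ar_empty_lip.T"]    # placeholder tracked empty
        pose_bone = driver_rig.pose.bones["lip.T"]    # placeholder driver bone

        # make the driver bone follow the tracked empty's location
        con = pose_bone.constraints.new(type='COPY_LOCATION')
        con.target = empty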

  • @JadonLolley
    @JadonLolley 2 years ago +1

    this is great. how many times now have i watched? thanks dude!🙌

  • @永麒張-h4n
    @永麒張-h4n 2 years ago

    The teacher speaks very well.

  • @Ochenter
    @Ochenter 3 years ago +2

    Lovely tutorial, Mister.
    Thanks.

  • @garywpearson1955
    @garywpearson1955 1 year ago

    I'm still having fun with you app! Working on more cartoons. Thanx again!

  • @HarryMcKenzieTV
    @HarryMcKenzieTV 2 years ago

    hallo! thanks for sharing! unfortunately the last link in your video description about blender track does not open. is there another video about how to use blender track?

  • @DOCsanimations
    @DOCsanimations 1 year ago

    Thanks for making this great add-on, it works fine. And thank you again for making it free, because beginners like me can afford that. Thanks for everything.

  • @jkneifl15
    @jkneifl15 2 years ago

    this looks like a very smart way to go about rigging a face. I look forward to trying it out.
    Thank you for sharing.

  • @3dApe
    @3dApe 2 years ago

    Man, I'm definitely going to support you on patreon once I have a little more money

  • @F1dg3t
    @F1dg3t 2 years ago

    Thank you, you have no idea the wonders this is doing for my production.

  • @felixfokoua2184
    @felixfokoua2184 2 years ago

    DAMN thanks a lot. This is one of the best add-ons on Blender for animation

  • @ShaneDotz
    @ShaneDotz 2 years ago

    That’s a great workaround I never thought of!

  • @PriestessOfDada
    @PriestessOfDada 2 years ago +4

    Thank you thank you thank you! I have been searching for how to do this particular thing for a month. It's going to save me so much time when making blendshapes. You have no idea. Thank you SO much

  • @aion3232
    @aion3232 1 year ago +1

    Is it possible to make Blendartrack work with Humangenerator?

  • @walejaw3d
    @walejaw3d 1 year ago

    bro this video is perfection. Thank you

  • @3dpprofessor
    @3dpprofessor 2 years ago

    Does this work in 3.3? When I try to import nothing happens. no error message, nothing.

  • @rogoz8958
    @rogoz8958 11 months ago +1

    App not supported on newer android devices on the playstore. Is there a safe link to install it unofficially ?

  • @isaacwardmusic
    @isaacwardmusic 9 months ago +1

    I'm having an issue where after I do the whole process, only one of the bones in the character's face rig is moving. Do you have any idea why?

  • @顏慈白-j3o
    @顏慈白-j3o 2 years ago

    Thank you for your kind teaching.:)

  • @KwasiAnimationStudio
    @KwasiAnimationStudio 2 years ago

    Definitely glad this popped up in my feed. I need facial tracking because I am in the process of making a feature film project. ✊🏾

    • @cgtinker
      @cgtinker  2 years ago

      If you are just looking for face tracking, check out my mobile app blendartrack - it's better for facial animation at the moment

  • @MikuDanceAnima
    @MikuDanceAnima 2 years ago

    Thank you very much, this is great, definitely have to buy it.

  • @jamesmcandrew7860
    @jamesmcandrew7860 1 year ago +2

    Hey! I LOVE how this works and your tutorial! Unfortunately, after copying the baked action to the bones, the only ones that move are the bones around the eyes. PLEASE HELP!

  • @eb-
    @eb- 2 years ago

    I'm having trouble at 3:18. My mesh is not moving with the control_rig. I tried it twice. Made the control_rig a parent of the mesh. Tried changing positions of the armature within the face. No luck. Help...

  • @santoshgujar5237
    @santoshgujar5237 3 years ago +1

    Thank you, Sir

  • @gert-janakerboom1314
    @gert-janakerboom1314 2 years ago

    amazing addon, thank you so much !

  • @Animotions01
    @Animotions01 3 years ago +1

    Thanks for this!!!

  • @dwassortedmedia
    @dwassortedmedia 3 years ago +3

    I seem to be having trouble with getting my action to apply to the rigify face rig. I am able to bake the action without error but when I select the rigify armature and click the action from the dropdown nothing seems to be transferred :/

    • @cgtinker
      @cgtinker  3 years ago +1

      Are you sure that you selected the same action? Did you check if the driver rig contains keyframes? Is the rigify rig you want to transfer to "generated" (it doesn't work with just the meta rig)?

    • @penniesonoureyes3436
      @penniesonoureyes3436 2 years ago

      I had the same problem - I'd forgotten to do a Rigify Generate Rig on my destination face (at 02:30 in the video)

    • @famitory
      @famitory 2 years ago

      i thought i had this problem, but what was actually happening was this: because i had to scale the transfer back down to scale 1 to keep the bake from getting mangled, the resulting movements were invisible on the larger-scale rigged model

  • @zoobloo
    @zoobloo 9 months ago +2

    Can't find the app anywhere anymore ): Tried both IOS and android devices

  • @Sjostrom2001
    @Sjostrom2001 1 year ago

    For some reason the camera does not switch to the front-facing camera whenever I choose face tracking, it just shows the rear camera

  • @dzagri4407
    @dzagri4407 2 years ago

    This is so cool!
    You just made life a bit easier and thanks for that😀

  • @zUMERSAL
    @zUMERSAL 2 years ago +1

    thank you for the very informative video. do you know if deleting bones affects blendartrack?

  • @eldarra562
    @eldarra562 1 year ago

    It's a good tutorial! Thanks man!

  • @MemeHeaven.0
    @MemeHeaven.0 1 year ago

    Can you do this with the new Snow v2 character from Blender that is fully rigged? I have tried but it doesn't seem to be working at all???

  • @lkgdmusic9634
    @lkgdmusic9634 2 years ago

    cheers mate!! What a legend!!

  • @hingakoroma2071
    @hingakoroma2071 2 years ago +1

    Hi, I couldn’t get blendartrack to go into selfie mode, so I can’t track my face

    • @cgtinker
      @cgtinker  2 years ago

      are you using iOS or android?
      iOS X+ is required for face tracking

  • @scrambles1230
    @scrambles1230 2 years ago

    Hey I tried setting this up and even after baking to action my mesh is not moving

  • @alan112223
    @alan112223 2 years ago

    That is just amazing!

  • @mdstudios380
    @mdstudios380 7 days ago

    Is there a way to apply face mocap data to a mesh with blend shapes?

  • @npereyra7054
    @npereyra7054 2 months ago +1

    A girl in a YouTube Blender video made a low-poly shark character as a VTuber. Can you maybe do more face rigs like this?

  • @doodledog5080
    @doodledog5080 1 year ago

    The jaw control fully moves the whole bottom of the neck as the jaw opens, I’m unsure how to fix it :(

  • @dougrutledge532
    @dougrutledge532 1 year ago +1

    Is there a way that you could include audio recording as well? Even in a simple format?
    The reason why I ask is that I'm trying to animate a scene, and if the audio track and the animation track are synced, that'd be a time saver.

  • @AdrianParkinsonFilms
    @AdrianParkinsonFilms 3 years ago +8

    I just gave this a try and it works really well. I would recommend that after you've transferred the animation, you add another NLA strip and use that to further refine areas that might not track perfectly. So in the test I just did, the lips didn't go narrow enough with OH sounds, so I simply brought the corner controls in.
    Is there a way to set the frame rate of imported motion? I typically work at 24 FPS rather than 60.

    • @Cripthulu
      @Cripthulu 2 years ago +1

      You should be able to scale the keyframes in the timeline on the x-axis; you just have to find the appropriate factor to scale them down by. For example, if you worked at 30 FPS you would scale them down so that they were halved. This video: ua-cam.com/video/4LnGFtGjk2E/v-deo.html talks about it more specifically. I've timestamped it to make it easy for you to find :-) I hope this helps!
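      A rough bpy sketch of the same retime for a 24 FPS scene (scaling 60 FPS capture keyframes by 24/60; the action name "driver_rig_action" is a placeholder):

        import bpy

        action = bpy.data.actions["driver_rig_action"]  # placeholder action name
        factor = 24 / 60  # retime a 60 FPS capture for a 24 FPS scene

        for fcurve in action.fcurves:
            for key in fcurve.keyframe_points:
                key.co.x *= factor            # move the keyframe in time
                key.handle_left.x *= factor   # keep the handles consistent
                key.handle_right.x *= factor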

    • @oyentemaniatico
      @oyentemaniatico 2 years ago

      @@Cripthulu very useful thank you

  • @KaasTVNL
    @KaasTVNL 3 years ago +1

    Thankyouuu! Will blinking be supported in the future?

    • @cgtinker
      @cgtinker  3 years ago +2

      Currently it's just available for iOS. Guess in a couple of years it will be available for Android too :)

  • @Ollie_sm
    @Ollie_sm 3 years ago +1

    awesome!

  • @aadityarai7027
    @aadityarai7027 1 year ago

    Thanks bro you are life saver❤❤❤

  • @michaeltyers7336
    @michaeltyers7336 1 year ago

    Can we use this technique if the target face is non-humanoid? Like a talking horse or a dragon?

    • @cgtinker
      @cgtinker  1 year ago +1

      well, kinda, but you gotta create a rig for that and probably calculate some data to get there. I don't think it's possible to automate something like this; shape keys are probably the closest, but still, it requires user effort

  • @domenicomastandrea7981
    @domenicomastandrea7981 1 year ago

    Fantastic tutorial
    Where can I find the Rigify feature set like the super face?

    • @cgtinker
      @cgtinker  1 year ago

      The list of available rig types appears in the Bone properties tab when the bone is selected in Pose Mode. Scroll down the Properties editor to find the Rigify Type panel.
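      The same thing can also be set from scripting; a small sketch, assuming the Rigify add-on is enabled, with "face" as a placeholder bone name ("faces.super_face" is the legacy face rig type shown in the video):

        import bpy

        metarig = bpy.context.object  # the metarig, active in Pose Mode
        # with Rigify enabled, every pose bone exposes a rigify_type property
        metarig.pose.bones["face"].rigify_type = "faces.super_face"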

  • @shivangipriya4153
    @shivangipriya4153 2 years ago

    Can you please show me how you set up the blendartrack app on Android? I could not. Please help me

  • @mikeben8769
    @mikeben8769 2 years ago

    Great work! I want to know if this face animation could work with Readyplayerme avatars! Thanks.

  • @fihruhafid
    @fihruhafid 2 years ago

    AMAZING!

  • @lincolnsmithtbp
    @lincolnsmithtbp 3 years ago +8

    I am a total beginner in character animation and mocap, and in Blender. I found your tutorial here a little too fast to follow, but that's because I'm still new to so many of the basics. I've successfully made it work within Blender, but never transferring from the capture to a custom character. Where do you recommend getting started if I want to focus on learning this field of CG? (16-year VFX veteran of compositing and supervision, just trying to learn this) Thank you.

    • @cgtinker
      @cgtinker  2 years ago +4

      Hey Lincoln, I just noticed your comment - sorry for the late answer. how is it going?
      For learning rigging, I can recommend cgdive. I've started to make some new tutorials about rigging and plan to update blendartrack soon to make the facial animation transfer easier.
      Which fields are you interested in? There are lots of great resources around :)

  • @ParOk_Art
    @ParOk_Art 6 months ago

    YYYY so cuuuuuuul

  • @rebelllion8853
    @rebelllion8853 1 year ago

    Hi! I tried with Blender 3.5 - Generate Rig doesn't work - a Python error - do you know of any issues with this version?

  • @user-tc1et8rm8f
    @user-tc1et8rm8f 2 years ago

    I can't understand how to transfer the animation data to bones, like you have at the beginning of the video. I need to export this bone animation to Unreal Engine, but with the control rig animation I can't... And I've tried 100 times to attach the control rig to the bones and it never works for me, even in different Blender versions

    • @cgtinker
      @cgtinker  2 years ago +1

      made the process easier in v2.2.0 ;)

    • @user-tc1et8rm8f
      @user-tc1et8rm8f 2 years ago

      @@cgtinker thanks, your work is insane

  • @SHA3DOW_
    @SHA3DOW_ 2 years ago

    Unfortunately not compatible with my device.

  • @NamastayGangstaArt
    @NamastayGangstaArt 2 years ago +1

    Can this work with auto rig pro?

  • @penniesonoureyes3436
    @penniesonoureyes3436 2 years ago +1

    Just tried this with Blender 3.1 and got an error when trying to generate the driver rig. Went back to Blender 3.0 and it worked OK.

    • @penniesonoureyes3436
      @penniesonoureyes3436 2 years ago +1

      Here's the error in full:
      location: :-1
      Error: Python: Traceback (most recent call last):
        File "C:\Users\tim_r\AppData\Roaming\Blender Foundation\Blender\3.1\scripts\addons\blendartrack-main\src\interface\Operators.py", line 56, in execute
          input_manager.generate_driver_rig()
        File "C:\Users\tim_r\AppData\Roaming\Blender Foundation\Blender\3.1\scripts\addons\blendartrack-main\src\management\input_manager.py", line 68, in generate_driver_rig
          rig = armature.get_armature("rig")
        File "C:\Users\tim_r\AppData\Roaming\Blender Foundation\Blender\3.1\scripts\addons\blendartrack-main\src\utils\blend\armature.py", line 10, in get_armature
          armature = bpy.data.objects[name]
      KeyError: 'bpy_prop_collection[key]: key "rig" not found'
      location: :-1

    • @maiers24
      @maiers24 2 years ago

      @@penniesonoureyes3436 I got the same error in blender 3.0 :(

    • @cgtinker
      @cgtinker  2 years ago

      @@maiers24 this may happen because your driver rig's name isn't 'rig' - the current version uses hard-coded naming conventions (which I'll fix soon).
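      Judging from the traceback above, the add-on looks the rig up as bpy.data.objects["rig"], so one possible workaround (an assumption based on that traceback, not a documented fix) is to make sure an object with that exact name exists before generating the driver rig:

        import bpy

        # "Face_Rig" is a placeholder for whatever your generated rigify rig is actually named
        if "rig" not in bpy.data.objects:
            bpy.data.objects["Face_Rig"].name = "rig"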

  • @DigitalImageWorksVFX
    @DigitalImageWorksVFX 1 year ago

    I did everything step by step and the movement won't transfer to the rigify rig :/ For example: the mouth doesn't move on it, only the bones around the eyes (but not in the same way as on the imported model). Scale is applied, I don't know what to do with it.

    • @cgtinker
      @cgtinker  1 year ago

      Did you try transferring to a *generated* rigify rig?

    • @DigitalImageWorksVFX
      @DigitalImageWorksVFX 1 year ago

      @@cgtinker By generated you mean this base_face_rig? Yes. I did a quick test: I assigned the same action from the driver_rig to this base_face_rig and to a new, default rigify rig. The results are the same: ua-cam.com/video/TLTramE-7QY/v-deo.html
      There is some movement in the eye zone, but it looks different. The rest of the face doesn't move.

    • @cgtinker
      @cgtinker  1 year ago

      ​@@DigitalImageWorksVFX It seems you are trying to animate a "not generated rig". In the vid, when you selected the rig which doesn't animate properly, you can see this "rigify button > re-generate rig" button.
      So in this case, you tried to transfer to a meta rig. I think this meta rig is from the driver rig. So well.. I guess you've got to create another face, or just a rigify humanoid rig (Shift + A ...), then make sure to press the "Generate Rig" button and try transferring to the *generated* rig

    • @DigitalImageWorksVFX
      @DigitalImageWorksVFX 1 year ago

      ​@@cgtinker YES! You are right! I was trying to transfer the action to metarig, not final rig. Now my character is finally smiling! Thank you for your support :)
      P.S.
      Have you ever thought about streaming data from the app to Blender live? Seems to be an interesting concept from other apps. Btw your link to Buy Me a Coffee in the app doesn't work :( Error 404

    • @cgtinker
      @cgtinker  1 year ago +1

      ​@@DigitalImageWorksVFX you are welcome - glad it worked out :)
      I'm considering implementing a live link in the future. I've been focusing on BlendArMocap for a while though..
      I left buymeacoffee - I'm just on patreon at the moment, but I'm considering stopping the donation thing in the future.. thanks a lot though =)

  • @KohdyMcintyre
    @KohdyMcintyre 2 years ago

    how do i import face data onto an existing model as this method did not work in my case

    • @KohdyMcintyre
      @KohdyMcintyre 2 years ago

      Day 4: the rig only generates 1 eye and then vanishes. These mocap YouTube tutorials are soooooooo bad, none of them are useful

  • @kellyvill9413
    @kellyvill9413 1 year ago

    Hi. Is this compatible with the Faceit plugin or the Autorig pro face?

  • @macitseferi1843
    @macitseferi1843 2 years ago

    i have my own character and he already has a rig on his face.. do i have to remove my own rig first?

  • @ewgross8630
    @ewgross8630 2 years ago

    thanks man

  • @vishvaspancholi5362
    @vishvaspancholi5362 3 years ago +1

    The character I have has Quaternion Rotation vectors, is there a way I can convert the animation data from Euler to Quaternion?

    • @cgtinker
      @cgtinker  3 years ago +1

      caution: the transfer only works to rigify face rigs.
      1. enable rigify, select the rig and go into pose mode.
      2. press N; in the rigify panel you can easily convert actions from euler to quaternion.

    • @vishvaspancholi5362
      @vishvaspancholi5362 3 years ago +2

      @@cgtinker It worked! Thanks!

    • @vishvaspancholi5362
      @vishvaspancholi5362 3 years ago +1

      @@cgtinker Also, just a suggestion, I don't know if it's possible or not but if you use mediapipe for creating face empties then it'd be very easy to use the add-on because mediapipe works with any webcam plus it can also track body and hands as well.

    • @cgtinker
      @cgtinker  3 years ago +1

      ​@@vishvaspancholi5362 Thanks, I'll look into it in the future. It seems promising!
      At the moment I'd like to do a simulation tool first and take a lil break from ar - but I'll stay updated

    • @cgtinker
      @cgtinker  3 years ago +2

      @@vishvaspancholi5362 made some tests with mediapipe, damn it's fun! maybe it's going to be the next project lol.

  • @kellyjohnson768
    @kellyjohnson768 1 year ago +1

    cg tinker, do you have to generate the face rig by using that method you did with the single bone, or can you just use the normal rigify rig you can generate?

  • @Japleen_Couture
    @Japleen_Couture 1 year ago +1

    Are all the add-ons and stuff used free?

    • @cgtinker
      @cgtinker  1 year ago +1

      yes, I only run patreon atm

  • @ollied2025
    @ollied2025 1 year ago

    Hi, i'm using a xiaomi mi6 and i can't install. play store says it isn't compatible with my device... any idea why?

    • @cgtinker
      @cgtinker  1 year ago

      some older devices are not supported by ArCore, google for ArCore supported devices and you'll find a list

    • @ollied2025
      @ollied2025 1 year ago

      @@cgtinker ah that would explain it then - thanks!

  • @zebcode
    @zebcode 2 years ago

    Hey, I don't see Rigify buttons option. Only Rigify bone groups, Layer Names, and Generation. Thought the option might be under generation but it doesn't seem to be there?
    EDIT: I forgot to go into edit mode and delete the button but the option name seems to have also changed. It's now called Rigify Samples

  • @chandandutta5740
    @chandandutta5740 1 year ago

    Hi, it's a very nice tutorial. I am using a MacBook Pro, and it does not have a numpad; what would be the alternative option for me to avoid the numpad? Thanks.

  • @1murkeybadmayn
    @1murkeybadmayn 3 years ago +1

    sorry how do you save the driver rig so that I can import and use it in another project?

    • @cgtinker
      @cgtinker  3 years ago +1

      After transferring, just stash the animation in the action editor. Then you have an NLA strip you can easily work with in other projects.

    • @1murkeybadmayn
      @1murkeybadmayn 3 years ago

      @@cgtinker What I meant is to save the rig itself so I don't have to reconstruct it every time I need to use it for a different animation.

    • @cgtinker
      @cgtinker  3 years ago

      @@1murkeybadmayn not really sure if I understand the question.
      usually you rig a geometry, bind it and use it as a character. If you use a rigify rig while doing so, you should be able to transfer the animation to the character's face using the add-on.
      Animations on a rig can be stored on the character as NLA strips (to reuse them). The usual workflow to do so is baking the animation and stashing the result.
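      A small bpy sketch of that bake-and-stash idea (assuming the animated object is named "rig" here as a placeholder and already holds the baked action):

        import bpy

        rig = bpy.data.objects["rig"]   # placeholder: the rig holding the baked action
        anim = rig.animation_data
        action = anim.action

        # push the action into an NLA strip so it can be reused in other projects
        track = anim.nla_tracks.new()
        track.name = "face_mocap"
        track.strips.new(action.name, int(action.frame_range[0]), action)
        anim.action = None              # unlink it, like the Stash button does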

    • @1murkeybadmayn
      @1murkeybadmayn 3 years ago

      @@cgtinker No, I meant to reuse the rig itself, not the animation, because I don't want to have to rebuild it for something else, but nevermind. I have a bigger issue. I did the weight painting thing but my neck is still deforming even after adding the support bones. I don't know what is going on. You also mentioned shift-clicking the bones when weight painting; what does this mean, because I cannot click on the bones when weight painting.

    • @1murkeybadmayn
      @1murkeybadmayn 3 years ago

      @@cgtinker Does not matter. The weight painting did not work, and the NLA animation just disappears once you quit and reopen Blender anyway, it does not save. Also, after baking the rig and transferring the animation, it just broke my model: half the face was separate from the model, the eyes did not bind to the bone and the head separated from the body of my model. I tried to follow every detail in the video; it was very frustrating, for a whole week I was at this and still ended up with nothing working. Overall, it was very hard to follow your video. The video is not in sync with the instructions and it was too fast even when I played it at half speed, and you skipped some bits when you were clicking, so I had no idea what you clicked to get certain things up. I think you should take the time to do the demonstration while talking, not doing the audio and video separately. It would also help if it was slower and didn't skip parts of what you're doing so we can follow it.

  • @lt.facepalm9566
    @lt.facepalm9566 2 years ago

    The android version is telling me my phone is not compatible with this version of the software, i'm using a fairly recent Redmagic 6 pro, help!

    • @cgtinker
      @cgtinker  2 years ago +1

      ar core support is required, here is a list of ar core supported devices, cannot rly do anything about that :/
      developers.google.com/ar/devices

    • @lt.facepalm9566
      @lt.facepalm9566 2 years ago

      @@cgtinker heya thanks for the quick reply!

  • @llYuki0okami
    @llYuki0okami 1 year ago

    I cant get this app, i get "this app will not work on your device", no damn specifics

    • @cgtinker
      @cgtinker  1 year ago +1

      Most likely your device doesn't meet the system requirements. There are lists from Google and Apple of ArCore and ArKit supported devices respectively. ArCore or ArKit is required to run BlendArTrack

  • @sujeetop3780
    @sujeetop3780 2 years ago

    Hey man❤️
    I tried to install the dependencies.. but they're not installing, how can I fix this?

  • @SH-ck9sr
    @SH-ck9sr 2 years ago +1

    Can you please add eye movement in this app

  • @bulba1995
    @bulba1995 2 years ago +1

    Thanks 👍

  • @willsculpts
    @willsculpts 2 years ago

    thanks man ❤❤❤❤😍😍😍😍

  • @rohancooper4521
    @rohancooper4521 2 years ago +1

    Hello! Thanks for this amazing tool and all the work you're doing for the community! Will this mocap work with the full body rigify rig on the Human Generator v3 humans? I am able to follow along until the last step but the action doesn't seem to do anything on the Human Generator Rigify rig.

    • @cgtinker
      @cgtinker  2 years ago +1

      Hmm..maybe they use the new rigify face rig version. I plan to implement that in the future. Currently that's not supported.

  • @joshuamedina5541
    @joshuamedina5541 2 years ago

    thank you so much

  • @asechannel6646
    @asechannel6646 2 years ago +1

    Sir, I can't download the add-on file..

    • @cgtinker
      @cgtinker  2 years ago

      uhm why?

    • @asechannel6646
      @asechannel6646 2 years ago

      Thanks sir, I already downloaded it, but in my retargeter app I can't see the file that I recorded

    • @asechannel6646
      @asechannel6646 2 years ago

      Sir, how can I transfer my face tracker file to my computer?

    • @cgtinker
      @cgtinker  2 years ago +1

      @@asechannel6646 in the mobile app you will see a folder icon in the bottom right corner. pressing it will lead you to your recordings.
      to select a recording, just click the dot icon. then, press the share icon and you will get prompted with different ways to transfer the data. I usually e-mail them to me or save them in my documents folder.

    • @asechannel6646
      @asechannel6646 2 years ago

      Thanks you so much god bless u

  • @agnoise
    @agnoise 2 years ago

    so this means it only works with the faces_super_face armature? Can't I transfer the animation to a custom armature?

    • @cgtinker
      @cgtinker  2 years ago

      it's supposed to be used with rigify (the metarig also uses the super face)

  • @roborenderer
    @roborenderer 2 years ago

    very helpful thanks

  • @asafilms
    @asafilms 5 months ago

    When will the add-on and the Android app be updated?

    • @cgtinker
      @cgtinker  4 months ago

      discontinued, not anytime soon sorry

  • @maiers24
    @maiers24 2 years ago +1

    This is incredible. Are there any tips on what to do when you get an error after pressing generate facial rig?

    • @cgtinker
      @cgtinker  2 years ago

      What's the error message?

  • @legendearn8570
    @legendearn8570 2 years ago

    can u please make a tutorial on how to apply it to a Human Generator model.. thanks a lot for this amazing addon,
    I shall be thankful if there is body animation as well as facial..

  • @manuarias20
    @manuarias20 2 years ago

    Thanks for the video and the info! I have a question: when I apply the action to another facial rig, only the eyes move, the other bones are still

    • @cgtinker
      @cgtinker  2 years ago

      can u share the file with me?

    • @manuarias20
      @manuarias20 2 years ago

      @@cgtinker Thanks for the response. I already solved it! I was trying to apply it to the bones instead of the rig. Thank you again for the tutorial and your willingness to answer questions. Have a great day!

  • @BlenderAddict
    @BlenderAddict 2 years ago

    Goodnight, it is not compatible with Android 11 on my Redmi Note 11 nor my Huawei P10 Lite on Android 8, why? Please tell me

    • @cgtinker
      @cgtinker  2 years ago

      The app requires ar core support

  • @luolin3289
    @luolin3289 2 years ago

    hi! there's absolutely nothing under my rigify button. I enabled the plugin though. I cannot find the super face

    • @mrcolz9373
      @mrcolz9373 2 years ago

      You have to be in edit mode

  • @AdrienLatapie
    @AdrienLatapie 1 year ago

    Thank you for this tutorial!! BTW, after baking the Action and assigning it to my rig, only the eyes move, do you know what could be the issue? D:

    • @cgtinker
      @cgtinker  1 year ago

      I think you tried transferring to a non-generated meta-rig. Make sure to upgrade it

  • @sairuzzz
    @sairuzzz 2 years ago

    Excellent addon! I have a problem with the mouth animation at the "Generate Driver Rig" step. The eye animation works fine, but the nose and mouth don't.

    • @cgtinker
      @cgtinker  2 years ago

      What happens? I don't fully understand the issue.

    • @sairuzzz
      @sairuzzz 2 years ago

      When I push the Generate Driver Rig button, the empties situated in the eye region are captured but all the others are not. So the eyes get the movement, but the lips and the other empties don't

    • @sairuzzz
      @sairuzzz 2 years ago

      I also named the armature 'rig' as you advised in previous comments

    • @cgtinker
      @cgtinker  2 years ago

      @@sairuzzz thanks.. hm.. I guess the import data is from iOS?
      there shouldn't be any renaming necessary. best is to just leave the names of the driver and base rig for now. I think it's only an issue if your actual character face rig is named 'rig', as rigify always names generated rigs 'rig', which can lead to issues.
      I wasn't able to reproduce this. which blender version are you using? do the empties contain a proper animation?

    • @sairuzzz
      @sairuzzz 2 years ago

      @@cgtinker, I'm on version 3.1. I consider myself an Android user :) (the mocap was made with the Android app). The problem is that the mocap empties for the lips don't respond to the driver rig after I generate it

  • @ivanmontiel7349
    @ivanmontiel7349 2 years ago

    Thanks!

  • @1murkeybadmayn
    @1murkeybadmayn 3 years ago

    Hi there. Is there a way to just use the empties as a driver to an already rigged model so that we don't have to make the bones from scratch?

    • @cgtinker
      @cgtinker  3 years ago +2

      working on another tool which will have drivers. but it isn't rdy yet.

    • @1murkeybadmayn
      @1murkeybadmayn 3 years ago

      @@cgtinker Oh that would be great. Because I have a model which is already rigged with bones and cost me a lot too lol. Just a shame I cannot just copy the driver directly to it.

    • @mrcolz9373
      @mrcolz9373 2 years ago

      @@cgtinker Any progress on this? I am in a similar predicament where I have a rigged model, and the rigify bit is possible but not efficient in terms of workflow. But great job on this addon, it's a life saver for short film makers.

  • @femisuccess124
    @femisuccess124 2 years ago

    Can I use the blendartrack plugin with meshes obtained from a planar track, like a Mocha Pro track of the face converted to empties? Can I use that mesh track data to generate a rig in the plugin, or does it only work with track data from the phone app?

    • @cgtinker
      @cgtinker  2 years ago

      it's only made for the phone app (didn't even know mocha pro till now tbh :P)

  • @garywpearson1955
    @garywpearson1955 2 years ago

    WOW!!!

  • @CarlosFuentesok
    @CarlosFuentesok 3 years ago

    Can this be exported to Unity, in FBX format? I managed to understand this video along with another one you have that goes through the steps faster. I found that this does not stick to the character's bones, it only sticks to the mesh. So I'm asking if it can be exported for use in Unity

  • @jamesdickerson6726
    @jamesdickerson6726 2 years ago

    That's fine and all, but how would you do it in real-time? It's not augmented reality if it's not in real-time. Motion capture and motion tracking are not the same thing.

  • @SuperRockcore
    @SuperRockcore 2 years ago

    very interested in this.. complete noob though (rigged like 1 thing b4) :( I got to 3:18 but my bones are just rotating in pose mode.. anyway, I will try again as there aren't many options for this kind of thing...

    • @cgtinker
      @cgtinker  2 years ago

      some bones can only be rotated, others can also be translated. make sure you are in pose mode, navigate to the armature tab and make sure you have activated the bone layers containing the ik bones. (sadly that's a step that's hard to describe without visuals, just google bone layers)
      in pose mode you can transform the bones. the "red" bones from rigify are usually IK drivers - those can usually be moved around freely. Yellow bones are usually only for rotations. The green ones are getting driven, so u can basically ignore them.
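      For Blender 3.x, a quick way to make sure no bone layers (and with them the IK controls) are hidden is to enable them all from the Python console; this is just a sketch, and note that Blender 4.x replaced layers with bone collections:

        import bpy

        arm = bpy.context.object.data  # the active armature's data (Blender 3.x)
        arm.layers = [True] * 32       # show all 32 bone layers so the IK controls are visible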

  • @Rocksetta
    @Rocksetta 2 years ago

    This is great stuff. I would like to make a video about it. With the new Blender and your newest version I am doing fine making the face rig empties and generating a face rig, also the action editor and baking to the driver rig. I just seem to be lost with using the action editor to transfer the baked actions. Such a simple step, but I seem to be missing something. Any suggestions?

    • @cgtinker
      @cgtinker  2 years ago +1

      to transfer the action you need to use a generated rigify face rig. on the generated rigify face rig you can just select the action in the action editor (actions are stored in the scene and don't depend on a certain object).
      Hope that helps :)
      Planning to make another video about this topic as I have a small usability update in mind
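      The script equivalent of picking the baked action in the Action Editor looks roughly like this (the object and action names are placeholders for your generated rigify rig and your baked action):

        import bpy

        target = bpy.data.objects["rig"]               # placeholder: the *generated* rigify rig
        baked = bpy.data.actions["driver_rig_action"]  # placeholder: the baked action

        if target.animation_data is None:
            target.animation_data_create()
        target.animation_data.action = baked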

    • @Rocksetta
      @Rocksetta 2 years ago

      @@cgtinker I got it working. If I concentrate on full driver--> armature rig-->mesh I can get things working. When I just try to move the action from the driver to just the armature rig I can't do it. Anyway all good now. I will link to you on twitter if I get a video going.

  • @nilslemke7542
    @nilslemke7542 2 years ago

    Hi, the way you transferred the animation didn't work for me, though I don't know why.. I created the action and put it on the control rig (which does work) but it doesn't move in sync with the driver rig, in fact it doesn't move at all. Would love any (total beginner friendly) tips. Thanks.

    • @cgtinker
      @cgtinker  2 years ago

      It's supposed to be transferred to another generated rigify rig, not the control rig. The idea is to have a face rigged with the generated rigify face rig and transfer the animation of the driver rig to it

    • @nilslemke7542
      @nilslemke7542 2 years ago

      @@cgtinker Yes, I get the idea. The driver rig has the animation that came from the video, which is then copied to the rig that controls the face, which is the control rig. (You gave it that name yourself in the video, hence I named it that way). So after copying the animation, the control rig and the driver rig should move in sync, yet my control rig doesn't move at all. Even though the animation data seems to be copied, as I can see it on the right side under "Animation".

    • @cgtinker
      @cgtinker  2 years ago

      @@nilslemke7542 weird, can you send me the blend file at hello@cgtinker.com?
      I'll take a look. Probably something went wild. Preferably send it without any meshes (to reduce attachment size)

    • @NiLem98
      @NiLem98 2 years ago

      @@cgtinker I realised that I was just stupid and didn't see you clicking "Bake To Action" because there was a cut right before it and it went so fast. 😅 So after that, it worked.

  • @famitory
    @famitory 2 years ago

    any tips for transferring onto a rig that's not really very human-shaped?

    • @cgtinker
      @cgtinker  2 years ago

      Try blendarmocap, it's better suited for more cartoonish characters. Will soon make a video on how to manipulate results