@@asechannel6646 in the mobile app you will see a folder icon in the bottom right corner. pressing it will lead you to your recordings. to select a recording, just click the dot icon. then, press the share icon and you will get prompted with different ways to transfer the data. I usually e-mail them to me or save them in my documents folder.
can u please make a tutorial on how to apply it to a Human Generator model.. thanks a lot for this amazing addon, I'd be thankful if there was body animation as well as facial..
Thanks for the video and the info! I have a question: when I apply the action to another facial rig, only the eyes move; the other bones stay still.
@@cgtinker Thanks for the response. I already solved it! I was trying to apply it to the bones instead of the rig. Thank you again for the tutorial and your willingness to answer questions. Have a great day!
When I push the generate driver rig button, the empties situated in the eye region are captured but all the others are not. So the eyes get the movement, but the lips and the other empties don't.
@@sairuzzz thanks.. hm.. I guess the import data is from iOS? there shouldn't be any renaming necessary. best is to just leave the names of the driver and base rig for now. I think it's only an issue if your actual character face rig is named 'rig', as rigify always names generated rigs 'rig', which can lead to issues. I wasn't able to reproduce this - which blender version are you using? do the empties contain a proper animation?
@@cgtinker, I'm on the 3.1 version. I consider myself an android user :) (mocap made with the android app). The problem is that the mocap empties from the lips didn't respond to the driver rig after I generated it
@@cgtinker Oh that would be great. Because I have a model which is already rigged with bones and cost me a lot too lol. Just a shame I cannot just copy the driver directly to it.
@@cgtinker Any progress on this? I am in a similar predicament where I have a rigged model, and the rigify bit is possible but not efficient in terms of workflow. But great job on this addon, it's a life saver for short film makers.
Can I use the blendar plugin with meshes obtained from a planar track, like a mocha pro track of the face converted to empties? Can I use that mesh track data to generate a rig in the blendar plugin, or does it only work with track data from the phone app?
Can this be exported to unity, in fbx format? I managed to understand this video along with another one of yours that goes through the steps faster. I found that this does not stick to the character's bones, it only sticks to the mesh. So I'm asking if it can be exported for use in Unity.
That's fine and all, but how would you do it in real-time? It's not augmented reality if it's not in real-time. Motion capture and motion tracking are not the same thing.
very interested in this.. complete noob though (rigged like 1 thing b4) :( I got to 3:18 but my bones are just rotating in pose mode.. anyway, I will try again as there aren't many options for this kind of thing...
some bones can only be rotated, others can also be translated. make sure you are in pose mode, navigate to the armature tab and make sure you have activated the bone layers containing the IK bones (sadly that's a step that's hard to describe without visuals, just google bone layers). in pose mode you can transform the bones. the "red" bones from rigify are usually IK drivers - those can usually be moved around freely. Yellow bones are usually only for rotations. The green ones are driven, so you can basically ignore them.
This is great stuff. I would like to make a video about it. With the new blender and your newest version I am doing fine making the face rig empties, generating a face rig, using the action editor and baking to the driver rig. I just seem to be lost with using the action editor to transfer the baked actions. Such a simple step but I seem to be missing something. Any suggestions?
to transfer the action you need to use a generated rigify face rig. on the generated rigify face rig you can just select the action in the action editor (actions are stored in the scene and don't depend on a certain object). Hope that helps :) Planning to make another video about this topic as I have a small usability update in mind
@@cgtinker I got it working. If I concentrate on full driver--> armature rig-->mesh I can get things working. When I just try to move the action from the driver to just the armature rig I can't do it. Anyway all good now. I will link to you on twitter if I get a video going.
Hi, the way you transferred the animation didn't work for me, though I don't know why.. I created the action and put it on the control rig (which does work) but it doesn't move in sync with the driver rig, in fact it doesn't move at all. Would love any (total beginner friendly) tips. Thanks.
It's supposed to be transferred to another generated rigify rig, not the control rig. The idea is to have a face rigged with the generated rigify face rig and transfer the animation of the driver rig to it
@@cgtinker Yes, I get the idea. The driver rig has the animation that came from the video, which is then copied to the rig that controls the face, which is the control rig. (You gave it that name yourself in the video, hence I named it that way). So after copying the animation, the control rig and the driver rig should move in sync, yet my control rig doesn't move at all. Even though the animation data seems to be copied, as I can see it on the right side under "Animation".
@@nilslemke7542 weird, can you send me the blend file at hello@cgtinker.com? I'll take a look. Probably something went wild. Preferably send it without any meshes (to reduce attachment size)
@@cgtinker I realised that I was just stupid and didn't see you clicking "Bake To Action" because there was a cut right before it and it went so fast. 😅 So after that, it worked.
Usually, I don't write any comments but you are an incredible lifesaver (and mostly timesaver). I'm initiating myself to 3D modeling and animation, and as a beginner I was totally not seeing myself animate faces for hours (moreover probably for a poor result). It's really easy to use and the result is impressive. So sincerely thank you for sharing this !
And for anyone wondering why the face rig does not move: it's not an issue. The face rig is used to generate the driver rig, and it's the driver rig that moves
(Precising to avoid someone else being stuck for nothing like me lmao)
thanks a lot, glad it enables you to move further :)
keep on creating!
I probably already left a comment at some point, but wanted to say this addon is phenomenal. I really hope you keep updating it because it has such a streamlined usage and is quite straightforward and effective
This is impressive! So the empty cloud generates (snaps to) a close rigify facial bone. This is an amazing improvement! Thank you for sharing this!
glad you like it! Here's what's happening:
- The global rigify super.face and local bones are getting aligned with the empties (doesn't take parent scale into account yet)
- The generated rigify rig drivers are getting constrained to the empties
At first I planned to make any rig snap to the empties, but after some testing I realised that this can't work out. Many drivers are usually involved when creating face rigs - that's why I went for the rigify face rig approach :)
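A rough sketch of the parent-scale caveat mentioned above, outside Blender: converting an empty's world position into a parent's local space has to divide out the parent's scale, otherwise the aligned bone lands in the wrong spot. The helper names below are made up for illustration, not taken from the add-on:

```python
# Minimal sketch: why ignoring parent scale mis-places an aligned bone.
# world_to_local converts an empty's world position into the parent's
# local space (translation + scale only); the _no_scale variant
# reproduces the caveat described above.
# (Names are illustrative, not from the blendartrack source.)

def world_to_local(world_pos, parent_loc, parent_scale):
    """World-space point to parent-local space, dividing out the scale."""
    return tuple(
        (w - o) / s for w, o, s in zip(world_pos, parent_loc, parent_scale)
    )

def world_to_local_no_scale(world_pos, parent_loc):
    """The naive variant: subtracts the parent offset but ignores its scale."""
    return tuple(w - o for w, o in zip(world_pos, parent_loc))

# A parent scaled by 2: an empty 2 units away is only 1 unit away locally.
empty_world = (2.0, 0.0, 0.0)
parent_loc = (0.0, 0.0, 0.0)
parent_scale = (2.0, 2.0, 2.0)

correct = world_to_local(empty_world, parent_loc, parent_scale)  # (1.0, 0.0, 0.0)
naive = world_to_local_no_scale(empty_world, parent_loc)         # (2.0, 0.0, 0.0)
```

With a uniformly scaled parent the naive version lands the bone twice as far out as it should, which matches the symptom of a misaligned generated rig.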
this is great. how many times now have i watched? thanks dude!🙌
The teacher speaks very well.
Lovely tutorial, Mister.
Thanks.
I'm still having fun with you app! Working on more cartoons. Thanx again!
hello! thanks for sharing! unfortunately the last link in your video description about blender track does not open. is there another video about how to use blender track?
Thanks for making a great add-on, it works fine. And thank you again for making it free, because beginners like me can afford it. Thanks for everything!
this looks like a very smart way to go about rigging a face. I look forward to trying it out.
Thank you for sharing.
Man, I'm definitely going to support you on patreon once I have a little more money
Thank you, you have no idea the wonders this is doing for my production.
DAMN thanks a lot. This is one of the best add-ons on Blender for animation
That’s a great workaround I never thought of!
Thank you thank you thank you! I have been searching for how to do this particular thing for a month. It's going to save me so much time when making blendshapes. You have no idea. Thank you SO much
Is it possible to make Blendartrack work with Humangenerator?
bro this video perfection. Thankyou
Does this work in 3.3? When I try to import nothing happens. no error message, nothing.
The app shows as not supported on newer android devices on the playstore. Is there a safe link to install it unofficially?
I'm having an issue where after I do the whole process, only one of the bones in the character's face rig is moving. Do you have any idea why?
Thank you for your kind teaching.:)
Definitely glad this popped up on my feed. Because I need facial tracking because I am in process of making a feature film project and need this. ✊🏾
If you are just looking for face tracking, check out my mobile app blendartrack - it's better for facial animation at the moment
Thank you very much, this is great, definitely have to buy it.
Hey! I LOVE how this works and your tutorial! Unfortunately, the only action that works after copying the bake to the bones are the bones around the eyes. PLEASE HELP!
I'm having trouble at 3:18. My mesh is not moving with the control_rig. I tried it twice. Made the control_rig a parent of the mesh. Tried changing positions of the armature within the face. No luck. Help...
Thank you, Sir
amazing addon, thank you so much !
Thanks for this!!!
I seem to be having trouble with getting my action to apply to the rigify face rig. I am able to bake the action without error but when I select the rigify armature and click the action from the dropdown nothing seems to be transferred :/
Are you sure that you selected the same action? Did you check if the driver rig contains keyframes? Is the rigify rig you want to transfer to "generated" (it doesn't work with just the meta rig)?
I had the same problem - I'd forgotten to do a Rigify Generate Rig on my destination face (at 02:30 in the video)
I thought I had this problem, but what was actually happening was: because I had to scale the transfer back down to scale 1 to keep the bake from mangling, the resulting movements were invisible on the larger-scale rigged model
Can't find the app anywhere anymore ): Tried both IOS and android devices
same ((
For some reason the camera does not switch to the front-facing camera when I choose face tracking, it just shows the rear camera
This is so cool!
You just made life a bit easier and thanks for that😀
thank you for the very informative video. Do you know if deleting bones affects blendartrack?
it's a good tutorial! Thanks man!
Can you do this with the new snow v2 character from blender that is fully rigged? I have tried but it doesn't seem to be working at all.
cheers mate!!What a legend!!
Hi, I couldn’t get blendartrack to go into selfie mode, so I can’t track my face
are you using iOS or android?
iOS X+ is required for face tracking
Hey I tried setting this up and even after baking to action my mesh is not moving
That is just amazing!
Is there a way to apply face mocap data to a mesh with blend shapes?
A girl in a blender video made a low-poly shark character like a vtuber - can you maybe do more of these face rigs?
The jaw control fully moves the whole bottom of the neck as the jaw opens, I’m unsure how to fix it :(
Is there a way that you could include audio recording as well? Even in a simple format?
The reason why I ask is that I'm trying to animate a scene, and if the audio track and the animation track are synced, that'd be a time saver.
I just gave this a try and it works really well. I would recommend that after you've transferred the animation, you add another NLA strip and use that to further refine areas that might not track perfectly. So in the test I just did, the lips didn't go narrow enough with OH sounds, so I simply brought the corner controls in.
Is there a way to set the frame rate of imported motion? I typically work at 24 FPS rather than 60.
You should be able to scale the keyframes in the timeline on the x-axis you just have to find the appropriate amount to scale them down by. For example if you worked at 30fps you would scale them down so that they were halved. This video: ua-cam.com/video/4LnGFtGjk2E/v-deo.html talks about it more specifically. I've timestamped it to make it easy for you to find :-) I hope this helps!
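The scaling described above is just a linear retime of the keyframe times by the ratio of the two frame rates. Here is the arithmetic as a small standalone sketch (the function name is made up; inside Blender you'd scale the keys on the x-axis in the Dope Sheet or Graph Editor instead):

```python
# Retiming keyframes recorded at one frame rate for a scene at another:
# multiply each keyframe's frame number by target_fps / source_fps.
# (Function name is illustrative; this mirrors the manual x-axis scaling
# described in the reply above.)

def retime_keyframes(frames, source_fps, target_fps):
    factor = target_fps / source_fps
    return [f * factor for f in frames]

# 60 fps capture dropped into a 24 fps scene: one second of motion
# (frames 0..60) must end at frame 24, i.e. a scale factor of 0.4.
capture = [0, 30, 60]
retimed = retime_keyframes(capture, source_fps=60, target_fps=24)
# retimed -> [0.0, 12.0, 24.0]
```

The 30 fps example from the reply falls out of the same formula: 30/60 = 0.5, so every key lands at half its original frame number.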
@@Cripthulu very useful thank you
Thankyouuu! Will blink be supported in the future?
Currently it's just available for iOS. Guess in a couple of years it will be available for Android too :)
awesome!
Thanks bro you are life saver❤❤❤
Can we use this technique if the target face is non-humanoid? Like a talking horse or a dragon?
well, kinda, but you gotta create a rig for that and probably calculate some data to get there. I don't think it's possible to fully automate something like this; shape keys are probably the closest, but still, it requires user effort
Fantastic tutorial
Where i can find the Rigify Feature Set like the Super Face?
The list of available rig types appears in the Bone properties tab when the bone is selected in Pose Mode. Scroll down the Properties editor to find Rigify Type panel.
Can you please show me how you set up the blendartrack app on Android? I could not. Please help me
Great work! I want to know if this face animation could work with Readyplayerme avatars! Thanks.
AMAZING!
I am a total beginner in character animation and mocap, and in Blender. I found your tutorial here a little too fast to follow but that's because I'm still new to so many of the basics but I've successfully made it work within Blender, but never transferring from the capture to a custom character. Where do you recommend getting started if I want to focus on learning this field of CG? (16 year VFX veteran of compositing and supervision, just trying to learn this) Thank you.
Hey Lincoln, I just recognised your comment - sorry for the late answer. how is it going?
For learning rigging, I can recommend cgdive. I've started to make some new tutorials about rigging and plan to update blendartrack soon to make the facial animation transfer easier.
In which fields are you interested? There are lots of great resources around :)
YYYY so cuuuuuuul
Hi! I tried with Blender 3.5 - Generate Rig doesn't work - a python problem. Do you know of any issues with this version?
I can't understand how to transfer animation data to bones, like you have at the beginning of the video. I need to export this bone animation to UnrealEngine, but with the control rig animation I can't... And I've tried 100 times to attach the control rig to the bones and it never works for me, even in different Blender versions
made the process easier in v2.2.0 ;)
@@cgtinker thanks, your work is insane
Unfortunately not compatible with my device.
Can this work with auto rig pro?
Just tried this with Blender 3.1 and got an error when trying to generate the driver rig. Went back to Blender 3.0 and it worked OK.
Here's the error in full:
location: :-1
Error: Python: Traceback (most recent call last):
File "C:\Users\tim_r\AppData\Roaming\Blender Foundation\Blender\3.1\scripts\addons\blendartrack-main\src\interface\Operators.py", line 56, in execute
input_manager.generate_driver_rig()
File "C:\Users\tim_r\AppData\Roaming\Blender Foundation\Blender\3.1\scripts\addons\blendartrack-main\src\management\input_manager.py", line 68, in generate_driver_rig
rig = armature.get_armature("rig")
File "C:\Users\tim_r\AppData\Roaming\Blender Foundation\Blender\3.1\scripts\addons\blendartrack-main\src\utils\blend\armature.py", line 10, in get_armature
armature = bpy.data.objects[name]
KeyError: 'bpy_prop_collection[key]: key "rig" not found'
location: :-1
@@penniesonoureyes3436 I got the same error in blender 3.0 :(
@@maiers24 this may happen because your driver rig name isn't 'rig' - the current version uses hard-coded naming conventions (which I'll fix soon).
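For what it's worth, the KeyError in the traceback above comes from looking the rig up by the hard-coded name. A lookup along these lines would tolerate renames - shown here as a plain-Python sketch over a name-to-object mapping (the real add-on uses bpy.data.objects; none of these helper names are from its source):

```python
# Sketch of a rename-tolerant lookup instead of a hard-coded key.
# Mirrors bpy.data.objects[name] raising KeyError when the rig was
# renamed (e.g. to 'rig.001'); falls back to a prefix match before
# giving up. (Helper names are illustrative, not from blendartrack.)

def find_armature(objects, preferred="rig"):
    """Return the object named `preferred`, else the first name starting with it."""
    if preferred in objects:
        return objects[preferred]
    for name, obj in objects.items():
        if name.startswith(preferred):
            return obj
    raise KeyError(f"no armature matching '{preferred}' found")

# The exact name 'rig' is missing, but the prefix match still succeeds.
scene = {"rig.001": "driver_rig", "face_empties": "empties"}
found = find_armature(scene)  # -> "driver_rig"
```

Until the naming convention is fixed in the add-on itself, the practical workaround is simply renaming the driver rig back to 'rig' before pressing the button.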
I did everything step by step and the movement won't transfer to the rigify rig :/ For example: the mouth doesn't move on it, only the bones around the eyes (and not in the same way as on the imported model). Scale is applied; I don't know what to do with it.
Did you try transferring to a *generated* rigify rig?
@@cgtinker By generated you mean this base_face_rig? Yes. I did a quick test: I assigned the same action from driver_rig to this base_face_rig and to new, default rigify rig. Results are the same: ua-cam.com/video/TLTramE-7QY/v-deo.html
There is some movement in the eyes zone, but looks different. Rest of the face doesn't move.
@@DigitalImageWorksVFX It seems you are trying to animate a "not generated rig". In the vid, when you selected the rig which doesn't animate properly, you can see this "rigify button > re-generate rig" button.
So in this case, you tried to transfer to a meta rig. I think this meta rig is from the driver rig. So well.. guess you got to create another face, or just a rigify humanoid rig (shift + a ...) then make sure to press the "generate rig" button and try transferring to the *generated* rig
@@cgtinker YES! You are right! I was trying to transfer the action to metarig, not final rig. Now my character is finally smiling! Thank you for your support :)
P.S.
Have you ever thought about streaming data from the app to blender live? Seems to be an interesting concept from other apps. Btw. your Buy Me a Coffee link in the app doesn't work :( Error 404
@@DigitalImageWorksVFX you are welcome - glad it worked out :)
I'm considering implementing a live link in the future. I've been focussing on BlendArMocap for a while though..
I left buymeacoffee - I'm just on patreon at the moment, but I'm considering stopping the donation thingy in the future.. thanks a lot though =)
how do I import face data onto an existing model? this method did not work in my case
Day 4: the rig only generates 1 eye and vanishes. These mocap youtube tutorials are soooooooo bad, none of them are useful
Hi. Is this compatible with the Faceit plugin or the Autorig pro face?
i have my own character and he already has a rig on his face.. do i have to remove my own rig first?
thanks man
The character I have has Quaternion Rotation vectors, is there a way I can convert the animation data from Euler to Quaternion?
caution: the transfer only works to rigify face rigs.
1. enable rigify, select the rig and go in pose mode.
2. press n, in the rigify panel you can easily convert actions from euler to quaternion.
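For anyone curious what such a conversion does under the hood: each XYZ Euler rotation maps to a quaternion. Blender's mathutils does this via Euler((x, y, z), 'XYZ').to_quaternion(); the standalone version below is only a sketch of the math, and the exact composition order depends on the Euler convention:

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def euler_xyz_to_quat(x, y, z):
    """XYZ Euler angles (radians) to a quaternion: rotate about X, then Y, then Z."""
    qx = (math.cos(x / 2), math.sin(x / 2), 0.0, 0.0)
    qy = (math.cos(y / 2), 0.0, math.sin(y / 2), 0.0)
    qz = (math.cos(z / 2), 0.0, 0.0, math.sin(z / 2))
    return quat_mul(qz, quat_mul(qy, qx))

# A 90-degree rotation about X alone:
q = euler_xyz_to_quat(math.pi / 2, 0.0, 0.0)
# q is approximately (0.7071, 0.7071, 0.0, 0.0)
```

The rigify panel batch-converts a whole action's rotation keyframes this way, which is why it is the easy route compared to converting curves by hand.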
@@cgtinker It worked! Thanks!
@@cgtinker Also, just a suggestion, I don't know if it's possible or not but if you use mediapipe for creating face empties then it'd be very easy to use the add-on because mediapipe works with any webcam plus it can also track body and hands as well.
@@vishvaspancholi5362 Thanks, I'll look into it in the future. It seems promising!
At the moment I'd like to do a simulation tool first and take a lil break from ar - but I'll stay updated
@@vishvaspancholi5362 made some tests with mediapipe, damn it's fun! maybe it's going to be the next project lol.
cg tinker, do you have to generate the face rig by using that method you did with the single bone, or can you just use the normal rigify rig you can generate?
All the add-ons and stuff used are free?
yes, I only run patreon atm
Hi, i'm using a xiaomi mi6 and i can't install. play store says it isn't compatible with my device... any idea why?
some older devices are not supported by ArCore, google for ArCore supported devices and you'll find a list
@@cgtinker ah that would explain it then - thanks!
Hey, I don't see Rigify buttons option. Only Rigify bone groups, Layer Names, and Generation. Thought the option might be under generation but it doesn't seem to be there?
EDIT: I forgot to go into edit mode and delete the button but the option name seems to have also changed. It's now called Rigify Samples
Hi, it's a very nice tutorial. I am using a MacBook Pro and it does not have a NUM PAD. What would be the alternative option for me to avoid the NUM PAD? Thanks.
sorry how do you save the driver rig so that I can import and use it in another project?
After transfering just stash the animation in the action editor. Then you have a nla strip you can easily work with in other projects.
@@cgtinker What I meant is to save the rig itself so I don't have to reconstruct it every time I need to use it for a different animation.
@@1murkeybadmayn not really sure if I understand the question.
usually you rig a geometry, bind it, and use it as a character. If you use a rigify rig while doing so, you should be able to transfer the animation to the character's face using the add-on.
Animations on a rig can be stored on the character as NLA strips (to reuse them). The usual workflow is baking the animation and stashing the result.
@@cgtinker No, I meant to reuse the rig itself, not the animation. Because I don't want to have to rebuild it for something else, but nevermind. I have a bigger issue: I did the weight painting thing but my neck is still deforming even after adding the support bones. I don't know what is going on. You also mentioned shift-clicking the bones when weight painting - what does this mean? Because I cannot click on the bones when weight painting.
@@cgtinker @cg tinker It does not matter. The weight painting did not work, and the NLA animation just disappears once you quit and reopen Blender; it does not save. Also, after baking the rig and transferring the animation, it broke my model: half the face was separate from the model, the eyes did not bind to the bone, and the head separated from the body. I tried to follow every detail in the video. Very frustrating; I was at this for a whole week and still ended up with nothing working. Overall, it was very hard to follow your video. The video is not in sync with the instructions, it was too fast even when played at half speed, and you skipped some bits while clicking, so I had no idea what you clicked to get certain things up. I think you should do the demonstration while talking, not record the audio and video separately. It would also help if it was slower and didn't skip parts, so we can follow along.
The Android version is telling me my phone is not compatible with this version of the software. I'm using a fairly recent RedMagic 6 Pro, help!
ARCore support is required. Here is a list of ARCore supported devices; can't really do anything about that :/
developers.google.com/ar/devices
@@cgtinker heya thanks for the quick reply!
I can't get this app. I get "this app will not work on your device", no damn specifics.
Most likely your device doesn't meet the system requirements. There are lists from Google and Apple of ARCore and ARKit supported devices respectively. ARCore or ARKit is required to run BlendArTrack.
Hey man❤️
I tried to install the dependencies, but they're not installing. How can I fix this?
Can you please add eye movement in this app
That would be way too hard; hundreds of movements a second.
Thanks 👍
thanks man ❤❤❤❤😍😍😍😍
Hello! Thanks for this amazing tool and all the work you're doing for the community! Will this mocap work with the full body rigify rig on the Human Generator v3 humans? I am able to follow along until the last step but the action doesn't seem to do anything on the Human Generator Rigify rig.
Hmm... maybe they use the new Rigify face rig version. I plan to implement that in the future; currently it's not supported.
thank you so much
Sir, I can't download the add-on file..
uhm why?
Thanks sir, I already downloaded it, but in the retargeting app I can't see the file that I recorded.
Sir, how can I transfer my face tracking file to my computer?
@@asechannel6646 In the mobile app you will see a folder icon in the bottom right corner. Pressing it will lead you to your recordings.
To select a recording, just click the dot icon. Then press the share icon and you will be prompted with different ways to transfer the data. I usually e-mail them to myself or save them in my Documents folder.
Thanks you so much god bless u
Does this mean it only works with the faces_super_face armature? Can't I transfer the animation to a custom armature?
it's supposed to be used with Rigify (the metarig also uses the 'super.face')
very helpful thanks
When will the add-on and Android app be updated?
The app is discontinued; not anytime soon, sorry.
This is incredible. Are there any tips on what to do when you get an error after pressing Generate Facial Rig?
What's the error message?
Can you please make a tutorial on how to apply it to a Human Generator model.. thanks a lot for this amazing addon.
I'd be thankful if there were body animation as well as facial.
Thanks for the video and the info! I have a question: when I apply the action to another facial rig, only the eyes move; the other bones stay still.
Can you share the file with me?
@@cgtinker Thanks for the response. I already solved it! I was trying to apply it to the bones instead of the rig. Thank you again for the tutorial and your willingness to answer questions. Have a great day!
Good evening. It is not compatible with Android 11 on my Redmi Note 11, nor my Huawei P10 Lite on Android 8. Why? Please tell me.
The app requires ARCore support.
hi! There's absolutely nothing under my Rigify button. I enabled the plugin though. I cannot find the super face.
You have to be in edit mode
Thank you for this tutorial!! BTW, after baking the Action and assigning it to my rig, only the eyes move, do you know what could be the issue? D:
I think you tried transferring to a non-generated meta-rig; make sure to generate it first.
Excellent addon! I have a problem with the mouth animation at the "Generate Driver Rig" step. The eye animation works fine, but the nose and mouth don't.
What happens? I don't fully understand the issue.
When I push the Generate Driver Rig button, the empties situated in the eye region get captured, but all the others don't. So the eyes get the movement, but the lips and the other empties don't.
I also named the armature 'rig' as you advised in previous comments.
@@sairuzzz thanks.. hm.. I guess the import data is from iOS?
There shouldn't be any renaming necessary; by now it's best to just leave the names of the driver and base rig as they are. I think it's only an issue if your actual character face rig is named 'rig', as Rigify always names generated rigs 'rig', which can lead to issues.
I wasn't able to reproduce this. Which Blender version are you using? Do the empties contain a proper animation?
@@cgtinker I'm on version 3.1. I consider myself an Android user :) (the mocap was made with the Android app). The problem is that the mocap empties from the lips don't respond to the driver rig after I generate it.
Thanks!
Hi there. Is there a way to just use the empties as a driver to an already rigged model so that we don't have to make the bones from scratch?
Working on another tool which will have drivers, but it isn't ready yet.
@@cgtinker Oh, that would be great, because I have a model which is already rigged with bones and cost me a lot too lol. Just a shame I cannot copy the driver directly to it.
@@cgtinker Any progress on this? I am in a similar predicament where I have a rigged model, and the rigify bit is possible but not efficient in terms of workflow. But great job on this addon, it's a life saver for short film makers.
Can I use the BlendAr plugin with meshes from a planar track, like a Mocha Pro track of the face converted to empties? Can I use that mesh track data to generate a rig in the BlendAr plugin, or does it only work with track data from the phone app?
It's only made for the phone app (didn't even know Mocha Pro till now tbh :P)
WOW!!!
Can this be exported to Unity in FBX format? I managed to understand this video along with another of yours that covers the steps faster. I found that this doesn't stick to the character's bones, only to the mesh. So I'm asking if it can be exported for use in Unity.
That's fine and all, but how would you do it in real time? It's not augmented reality if it's not in real time. Motion capture and motion tracking are not the same thing.
very interested in this.. complete noob though (rigged like 1 thing before) :( I got to 3:18 but my bones are just rotating in Pose Mode.. anyway, I will try again, as there aren't many options for this kind of thing...
Some bones can only be rotated; others can also be translated. Make sure you are in Pose Mode, navigate to the armature tab, and make sure you have activated the bone layers containing the IK bones. (Sadly that's a step hard to describe without visuals; just google bone layers.)
In Pose Mode you can transform the bones. The "red" bones from Rigify are usually IK drivers; those can usually be moved around freely. Yellow bones are usually only for rotations. The green ones are driven, so you can basically ignore them.
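For anyone who can't find the layer toggles in the UI, the same thing can be done from Blender's Python console. This is a minimal sketch, assuming the generated Rigify armature object is named "rig" (Rigify's default name); it only runs inside Blender.

```python
import bpy

arm = bpy.data.objects["rig"].data  # the armature datablock

# Armatures before Blender 4.0 have 32 bone layers; Rigify hides several
# control layers by default. Enabling them all makes every IK/FK control
# bone visible in Pose Mode.
arm.layers = [True] * 32

# In Blender 4.0+ bone layers were replaced by bone collections:
# for coll in arm.collections:
#     coll.is_visible = True
```

Once the layers are visible, selecting a red IK bone and pressing G should move it freely, while the yellow FK bones respond to rotation (R) only.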
This is great stuff. I would like to make a video about it. With the new Blender and your newest version, I am doing fine making the face rig empties and generating a face rig, also the Action Editor and baking to the driver rig. I just seem to be lost with using the Action Editor to transfer the baked actions. Such a simple step, but I seem to be missing something. Any suggestions?
To transfer the action you need to use a generated Rigify face rig. On the generated Rigify face rig you can just select the action in the Action Editor (actions are stored in the scene and don't depend on a certain object).
Hope that helps :)
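The transfer step above can also be sketched in Blender's Python API, which makes it clear that an action is just a datablock being assigned to a new rig. The object and action names here are assumptions for illustration; run this inside Blender.

```python
import bpy

# Actions live in bpy.data.actions, independent of any object,
# so "transferring" is just assigning one to the target rig.
face_rig = bpy.data.objects["character_face_rig"]  # your generated rigify rig
action = bpy.data.actions["baked_face_mocap"]      # the baked driver action

face_rig.animation_data_create()         # ensure the rig has animation data
face_rig.animation_data.action = action  # same as picking it in the Action Editor
```

This only produces correct motion if the target rig has matching bone names, which is why a generated Rigify face rig is required rather than an arbitrary custom armature.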
Planning to make again a video about this topic as I have a small usability update in mind
@@cgtinker I got it working. If I concentrate on the full chain, driver to armature rig to mesh, I can get things working. When I just try to move the action from the driver to just the armature rig, I can't do it. Anyway, all good now. I will link to you on Twitter if I get a video going.
Hi, the way you transferred the animation didn't work for me, though I don't know why. I created the action and put it on the control rig (which does work), but it doesn't move in sync with the driver rig; in fact it doesn't move at all. Would love any (total beginner friendly) tips. Thanks.
It's supposed to be transferred to another generated rigify rig, not the control rig. The idea is to have a face rigged with the generated rigify face rig and transfer the animation of the driver rig to it
@@cgtinker Yes, I get the idea. The driver rig has the animation that came from the video, which is then copied to the rig that controls the face, which is the control rig. (You gave it that name yourself in the video, hence I named it that way). So after copying the animation, the control rig and the driver rig should move in sync, yet my control rig doesn't move at all. Even though the animation data seems to be copied, as I can see it on the right side under "Animation".
@@nilslemke7542 weird, can you send me the blend file at hello@cgtinker.com?
I'll take a look. Probably something went wild. Preferably send it without any meshes (to reduce attachment size)
@@cgtinker I realised that I was just stupid and didn't see you clicking "Bake To Action" because there was a cut right before it and it went so fast. 😅 So after that, it worked.
any tips for transferring onto a rig that's not really very human-shaped?
Try BlendArMocap; it's better suited for more cartoonish characters. Will soon make a video on how to manipulate results.