blendartrack - ar face tracking to rigify rig transfer
- Published 11 Jul 2024
- If you don't have blendartrack installed already, search for blendartrack in the app store to get the app or use the following links.
iOS:
apps.apple.com/us/app/blendar...
Android App:
play.google.com/store/apps/de...
The add-on is available on GitHub (requires Blender 2.9+):
github.com/cgtinker/blendartrack
Want to chat or need help? Join me on Discord! :)
/ discord
Want to support the project?
/ cgtinker
Timecodes
0:00 - Introduction
0:31 - Align Rigify Super Face
2:16 - Bind Mesh To Rig
4:22 - Setup and Import Tracking Data
5:51 - Transfer Animation Data
Don't know blendartrack? Checkout this video:
• blendartrack - ar moti...
I probably already left a comment at some point, but wanted to say this addon is phenomenal. I really hope you keep updating it, because it has such streamlined usage and is quite straightforward and effective
Usually I don't write any comments, but you are an incredible lifesaver (and mostly timesaver). I'm just getting started with 3D modeling and animation, and as a beginner I totally didn't see myself animating faces for hours (probably for a poor result, too). It's really easy to use and the result is impressive. So sincerely, thank you for sharing this!
And for anyone not understanding why the face rig does not move: it's absolutely not an issue (that's okay). The face rig is used to generate the driver rig, and it's the driver rig that moves.
(Clarifying this to avoid anyone else getting stuck over nothing like me lmao)
thanks a lot, glad it enables you to move further :)
keep on creating!
Thank you, you have no idea the wonders this is doing for my production.
This is impressive! So the empty cloud generates (snaps to) a close rigify facial bone. This is an amazing improvement! Thank you for sharing this!
glad you like it! here's what's happening:
- The global rigify super.face and local bones are getting aligned with the empties (doesn't take parent scale into account yet)
- The generated rigify rig drivers are getting constrained to the empties
At first I planned to make any rig snap to the empties, but after some testing I realised that this cannot work out. Face rigs usually rely on many drivers, which is why I went for the rigify face rig approach :)
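For readers curious how an alignment step like the one described above could work in principle, here is a minimal, hypothetical sketch in plain Python (this is not the add-on's actual code, and all names — bones, empties, the function itself — are invented for illustration): each bone head is matched to the nearest tracked empty.

```python
from math import dist

def match_bones_to_empties(bone_heads, empties):
    """Match each bone head to the nearest empty position.

    bone_heads: dict of bone name -> (x, y, z)
    empties:    dict of empty name -> (x, y, z)
    Returns a dict mapping each bone name to the name of its closest empty.
    """
    mapping = {}
    for bone, head in bone_heads.items():
        # Pick the empty whose position minimises the Euclidean distance.
        mapping[bone] = min(empties, key=lambda e: dist(head, empties[e]))
    return mapping

# Hypothetical example data: two bone heads, two tracked empties.
bones = {"lip.T": (0.0, 0.1, 1.0), "brow.L": (0.2, 0.1, 1.4)}
targets = {"cgt_mouth_top": (0.01, 0.1, 1.02), "cgt_brow_left": (0.19, 0.1, 1.41)}
print(match_bones_to_empties(bones, targets))
```

In the real add-on the matching additionally distinguishes global and local bone transforms (and, as noted above, does not yet account for parent scale); this sketch only shows the nearest-neighbour idea.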
this looks like a very smart way to go about rigging a face. I look forward to trying it out.
Thank you for sharing.
this is great. how many times now have i watched? thanks dude!🙌
That’s a great workaround I never thought of!
DAMN thanks a lot. This is one of the best add-ons on Blender for animation
Lovely tutorial, Mister.
Thanks.
This is so cool!
You just made life a bit easier and thanks for that😀
Man, I'm definitely going to support you on patreon once I have a little more money
Thank you for your kind teaching.:)
amazing addon, thank you so much !
Thank you very much, this is great, definitely have to buy it.
I'm still having fun with your app! Working on more cartoons. Thanks again!
bro this video is perfection. Thank you
Thanks bro you are life saver❤❤❤
it's a good tutorial! Thanks man!
Thanks for making a great add-on, it works fine. And thanks again for making it free, because beginners like me can afford that. Thanks for everything
Definitely glad this popped up on my feed, because I need facial tracking. I'm in the process of making a feature film project and need this. ✊🏾
If you are just looking for face tracking, check out my mobile app blendartrack - it's better for facial animation at the moment
cheers mate!!What a legend!!
Thanks for this!!!
That is just amazing!
Thank you thank you thank you! I have been searching for how to do this particular thing for a month. It's going to save me so much time when making blendshapes. You have no idea. Thank you SO much
thanks man ❤❤❤❤😍😍😍😍
Thank you, Sir
thanks man
thank you so much
AMAZING!
very helpful thanks
Thanks!
awesome!
The teacher speaks very well.
Hey! I LOVE how this works and your tutorial! Unfortunately, the only action that works after copying the bake to the bones are the bones around the eyes. PLEASE HELP!
YYYY so cuuuuuuul
hello! thanks for sharing! unfortunately the last link in your video description about blendartrack does not open. is there another video about how to use blendartrack?
thank you for the very informative video. do you know if deleting bones affects blendartrack?
cg tinker, do you have to generate the face rig by using that method you did with the single bone, or can you just use the normal rigify rig you can generate?
Is there a way that you could include audio recording as well? Even in a simple format?
The reason why I ask is that I'm trying to animate a scene, and if the audio track and the animation track are synced, that'd be a time saver.
I just gave this a try and it works really well. I would recommend that after you've transferred the animation, you add another NLA strip and use that to further refine areas that might not track perfectly. So in the test I just did, the lips didn't go narrow enough with OH sounds, so I simply brought the corner controls in.
Is there a way to set the frame rate of imported motion? I typically work at 24 FPS rather than 60.
You should be able to scale the keyframes in the timeline on the x-axis, you just have to find the appropriate amount to scale them down by. For example, if you worked at 30fps you would scale them down so that they were halved. This video: ua-cam.com/video/4LnGFtGjk2E/v-deo.html talks about it more specifically. I've timestamped it to make it easy for you to find :-) I hope this helps!
@@Cripthulu very useful thank you
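The retiming math in the thread above can be sketched as a tiny helper (a hypothetical illustration, not part of the add-on): multiply every frame number by the target-to-source fps ratio, so a 60 fps capture keeps its wall-clock timing on a 24 fps timeline.

```python
def retime_keyframes(frames, src_fps, dst_fps):
    """Rescale keyframe frame numbers from src_fps to dst_fps.

    A clip recorded at 60 fps keeps the same wall-clock timing on a
    24 fps timeline if every frame number is multiplied by 24/60.
    """
    scale = dst_fps / src_fps
    return [round(f * scale) for f in frames]

# 60 fps keys at frames 0, 30, 60 (i.e. 0 s, 0.5 s, 1 s)
print(retime_keyframes([0, 30, 60], 60, 24))
```

Note that rounding to whole frames discards sub-frame timing, which is usually fine for facial capture but worth knowing about.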
Can this be exported to Unity in FBX format? I managed to understand this video along with another one of yours that covers the steps faster. I found that this does not stick to the character's bones, only to the mesh. So I'm asking whether it can be exported for use in Unity
Great work! I want to know if this face animation could work with Readyplayerme avatars! Thanks.
Hello! Thanks for this amazing tool and all the work you're doing for the community! Will this mocap work with the full body rigify rig on the Human Generator v3 humans? I am able to follow along until the last step but the action doesn't seem to do anything on the Human Generator Rigify rig.
Hmm..maybe they use the new rigify face rig version. I plan to implement that in the future. Currently that's not supported.
Does this work in 3.3? When I try to import nothing happens. no error message, nothing.
I am a total beginner in character animation and mocap, and in Blender. I found your tutorial here a little too fast to follow but that's because I'm still new to so many of the basics but I've successfully made it work within Blender, but never transferring from the capture to a custom character. Where do you recommend getting started if I want to focus on learning this field of CG? (16 year VFX veteran of compositing and supervision, just trying to learn this) Thank you.
Hey Lincoln, I just recognised your comment - sorry for the late answer. how is it going?
For learning rigging, I can recommend cgdive. I've started to make some new tutorials about rigging and plan to update blendartrack soon to make the facial animation transfer easier.
In which fields are you interested? There are lots of great resources around :)
Hey! Awesome content. How come I've never seen you before?? I really love your BlendArMocap project. I was actually thinking of writing my Bachelor's thesis about this exact topic and try my own project haha. Anyway, I wanted to ask if there's any possibility to get the project file from this video, since I would love to try some things out. I completely understand if this isn't possible :) Thank you!
Thanks! The blendartrack facial animation sample file? Gotta check if it's still in my docs. Let me know your mail / way to send it to you if I find it.
If you mind sharing it here, let me know at hello(at)cgtinker.com
Thanks 👍
Hi, it's a very nice tutorial. I am using a MacBook Pro and it does not have a num pad; what would be the alternative option for me, avoiding the num pad? Thanks.
Hi. Is this compatible with the Faceit plugin or the Autorig pro face?
Is it possible to make Blendartrack work with Humangenerator?
Can you do this with the new snow v2 character from blender that is fully rig I have tried but seem to be not working at all???
Hey, I don't see Rigify buttons option. Only Rigify bone groups, Layer Names, and Generation. Thought the option might be under generation but it doesn't seem to be there?
EDIT: I forgot to go into edit mode and delete the button but the option name seems to have also changed. It's now called Rigify Samples
I'm having an issue where after I do the whole process, only one of the bones in the character's face rig is moving. Do you have any idea why?
App not supported on newer android devices on the playstore. Is there a safe link to install it unofficially ?
Thank you for this tutorial!! BTW, after baking the Action and assigning it to my rig, only the eyes move, do you know what could be the issue? D:
I think you tried transferring to a non-generated meta-rig. Make sure to generate it.
For some reason the camera does not switch to show the inner camera whenever i choose face tracking, it just shows the outer camera
WOW!!!
Hi! I tried with Blender 3.5 - Generate Rig doesn't work - a Python problem - do you know of any issues with this version?
I'm having trouble at 3:18. My mesh is not moving with the control_rig. I tried it twice. Made the control_rig a parent of the mesh. Tried changing positions of the armature within the face. No luck. Help...
Thanks for the video and the info! I have a question: when I apply the action to another facial rig, only the eyes move; the other bones stay still
can you share the file with me?
@@cgtinker Thanks for the response. I already solved it! I was trying to apply it to the bones instead of the rig. Thank you again for the tutorial and your willingness to answer questions. Have a great day!
Can't find the app anywhere anymore ): Tried both IOS and android devices
this may be a bit of a noob question, but how can we change the size of the animated face mesh and keep the animations? I need to size mine up in Blender, but the animations are set in keyframes when imported, and there are a lot of them. Is there a way to change the size of the entire animation and keep everything else? Thanks
I think the easiest workflow is just to parent the mesh to an empty, then scale and transform the empty to your liking :)
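The reason this trick works: a child's world-space position is its local position transformed by the parent, so uniformly scaling the parent empty rescales every keyframed point at once, without touching any keyframes. A minimal sketch in plain Python (illustrative only; the function name is made up):

```python
def to_world(local_point, parent_scale, parent_location=(0.0, 0.0, 0.0)):
    """World position of a point parented to a uniformly scaled empty.

    Each local coordinate is multiplied by the parent's scale and then
    offset by the parent's location (rotation omitted for simplicity).
    """
    return tuple(parent_scale * p + o for p, o in zip(local_point, parent_location))

# Scaling the parent empty by 2 doubles every animated offset.
print(to_world((0.1, 0.2, 0.3), 2.0))
```

Because the scaling happens at the parent level, all keyframes on the child stay exactly as imported; only the evaluated world transform changes.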
i have my own character and he already has a rig on his face.. do i have to remove my own rig first?
This is incredible. Are there any tips on what to do when you get an error after pressing generate facial rig?
What's the error message?
Fantastic tutorial
Where can I find the Rigify feature set like the Super Face?
The list of available rig types appears in the Bone properties tab when the bone is selected in Pose Mode. Scroll down the Properties editor to find Rigify Type panel.
can you please make a tutorial on how to apply it to a Human Generator model.. thanks a lot for this amazing addon,
I shall be thankful if there is body animation as well as facial..
Hey I tried setting this up and even after baking to action my mesh is not moving
Can this work with auto rig pro?
The jaw control fully moves the whole bottom of the neck as the jaw opens, I’m unsure how to fix it :(
Can you please show me how you set up the blendartrack app on Android? I could not. Please help me
Thank youuu! Will blink be supported in the future?
Currently it's just available for iOS. Guess in a couple of years it will be available for Android too :)
Hey man❤️
I tried to install the dependencies.. but it's not installing, how can I fix this
Can I use the blendar plugin with meshes obtained from a planar track, like a Mocha Pro track of the face converted to empties? Can I use that mesh track data to generate a rig in the blendar plugin, or does it only work with track data from the phone app?
it's only made for the phone app (didn't even know mocha pro till now tbh :P)
This is great stuff. I would like to make a video about it. With the new Blender and your newest version I am doing fine making the face rig empties and generating a face rig, also the action editor and baking to the driver rig. I just seem to be lost with using the action editor to transfer the baked actions. Such a simple step, but I seem to be missing something. Any suggestions?
to transfer the action you need to use a generated rigify face rig. on the generated rigify face rig you can just select the action in the action editor (actions are stored in the scene and don't depend on a certain object).
Hope that helps :)
Planning to make another video about this topic, as I have a small usability update in mind
@@cgtinker I got it working. If I concentrate on the full driver --> armature rig --> mesh chain I can get things working. When I just try to move the action from the driver to just the armature rig, I can't do it. Anyway, all good now. I will link to you on twitter if I get a video going.
Hey!! great tutorial and plug-in! I'm having a bit of trouble when I get to the generate driver rig step. When I have my empty with the animated gesture and then align the face rig to it and hit generate driver rig, it makes something all scrambled. Is there any troubleshooting for this, or has this issue been run into before? I would really appreciate help with a solution to this issue!!
I'm having the same issue as well, I have followed every detail of the tutorial but things go crazy when I generate the driver rig.
it's important to make sure that the scale of the driver rig & the rig which should get animated is 1.
@@girlxsenpaigaming263 I can't even generate a face rig. It seems like in the video the parent locator he's talking about is called face_Motion_, but in mine the parent locator that moves all the empties is called 'cgt_HeadController'. There is a locator called Face_Motion_, but it doesn't seem to be affecting anything after you apply the translation and rotation. The capitalization is also different from the video, which leads me to believe he updated the code since then and there might be a bug? Even if I try to scale the generated face rig, it stays scrambled.
Hello, I was having the same issue! Then I realized I needed to change my input device to iOS and it worked!
Can we use this technique if the target face is non-humanoid? Like a talking horse or a dragon?
well, kinda, but you've got to create a rig for that and probably calculate some data to get there. I don't think it's possible to automate something like this; shape keys are probably the closest, but still, it requires user effort
does this mean it only works with the faces_super_face armature? Can't I transfer the animation to a custom armature?
it's supposed to be used with rigify (the metarig also uses the 'super_face')
I seem to be having trouble with getting my action to apply to the rigify face rig. I am able to bake the action without error but when I select the rigify armature and click the action from the dropdown nothing seems to be transferred :/
Are you sure that you selected the same action? Did you check if the driver rig contains keyframes? Is the rigify rig you want to transfer to "generated" (it doesn't work with just the meta rig)?
I had the same problem - I'd forgotten to do a Rigify Generate Rig on my destination face (at 02:30 in the video)
i thought i had this problem, but what was actually happening is that because i had to scale the transfer back down to scale 1 to keep the bake from getting mangled, the resulting movements were invisible on the larger-scale rigged model
hi! there's absolutely nothing under my rigify button. I enabled the plugin though. I cannot find the super face
You have to be in edit mode
Hi there. Is there a way to just use the empties as a driver to an already rigged model so that we don't have to make the bones from scratch?
working on another tool which will have drivers, but it isn't ready yet.
@@cgtinker Oh that would be great. Because I have a model which is already rigged with bones and cost me a lot too lol. Just a shame I cannot just copy the driver directly to it.
@@cgtinker Any progress on this? I am in a similar predicament where I have a rigged model, and the rigify bit is possible but not efficient in terms of workflow. But great job on this addon, it's a life saver for short film makers.
Unfortunately not compatible with my device.
When I generate the driver rig and then transfer (bake) data to an action, after selecting the action on my main mesh it goes all weird, like the bones and joints go a bit crazy. Sorry, hope that makes sense, but I'm wondering if I'm doing something wrong? I'm a total noob so please bear with me 😪
possibly you scaled up either of the rigs and didn't apply the scale?
That's fine and all, but how would you do it in real-time? It's not augmented reality if it's not in real-time. Motion capture and motion tracking are not the same thing.
All the add-ons and stuff used are free?
yes, I only run a patreon atm
any tips for transferring onto a rig that's not really very human-shaped?
Try blendarmocap, it's better suited for more cartoonish characters. Will soon make a video on how to manipulate results
how do i import face data onto an existing model? this method did not work in my case
Day 4: the rig only generates 1 eye and vanishes. These MOCAP youtube tutorials are so bad, none of them are useful
Hi, I couldn't get blendartrack to go into selfie mode so I can't track my face
are you using iOS or android?
iOS X+ is required for face tracking
The character I have has Quaternion Rotation vectors, is there a way I can convert the animation data from Euler to Quaternion?
caution: the transfer only works with rigify face rigs.
1. enable rigify, select the rig and go in pose mode.
2. press n, in the rigify panel you can easily convert actions from euler to quaternion.
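For reference, the conversion behind that panel button can be sketched in plain Python (an illustrative implementation, not the add-on's or Blender's actual code). Blender's default XYZ euler order applies the X rotation first, so the combined quaternion is qz * qy * qx:

```python
from math import cos, sin

def quat_mul(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def euler_xyz_to_quat(x, y, z):
    """Convert XYZ euler angles (radians) to a (w, x, y, z) quaternion."""
    qx = (cos(x / 2), sin(x / 2), 0.0, 0.0)  # rotation about X
    qy = (cos(y / 2), 0.0, sin(y / 2), 0.0)  # rotation about Y
    qz = (cos(z / 2), 0.0, 0.0, sin(z / 2))  # rotation about Z
    # XYZ order applies the x rotation first: q = qz * qy * qx
    return quat_mul(qz, quat_mul(qy, qx))

print(euler_xyz_to_quat(0.0, 0.0, 0.0))  # identity rotation
```

The rigify panel does this per keyframe for you, so you only ever need the manual math if you are scripting the conversion yourself.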
@@cgtinker It worked! Thanks!
@@cgtinker Also, just a suggestion, I don't know if it's possible or not but if you use mediapipe for creating face empties then it'd be very easy to use the add-on because mediapipe works with any webcam plus it can also track body and hands as well.
@@vishvaspancholi5362 Thanks, I'll look into it in the future. It seems promising!
At the moment I'd like to do a simulation tool first and take a lil break from ar - but I'll stay updated
@@vishvaspancholi5362 made some tests with mediapipe, damn it's fun! maybe it's going to be the next project lol.
I can't understand how to transfer the animation data to bones, like you have at the beginning of the video. I need to export this bone animation to Unreal Engine, but with the control rig animation I can't... And I tried 100 times to attach the control rig to the bones and it never worked for me, even in different versions of Blender
made the process easier in v2.2.0 ;)
@@cgtinker thanks, your work is insane
I guess you used global coordinates for the empties rather than local? I have weird stretching bugs depending on the orientation and direction of the generated control rig
the add-on is meant to be used in a specific way and currently is not a one-click solution. I'm considering making the process easier soon and updating the tutorial.
In this tutorial, the empties are in the local space of the parent object, which gets scaled. After generating the rig it's important to apply the scale, otherwise weird deformations occur on the generated rig in step 2.
@@cgtinker Thank you for getting back to me. After tinkering a while, I realized that the error was totally on my side. Your add-on is amazing!
Do shape keys help synchronize the many data points on the face, for example the eyelids? I get some weird deformations of the eyelids, as the facial recognition does not perfectly capture blinking and similar micro-movements.
But judging the add-on as a whole, you sir, have done an amazing job. Thank you so much for providing this wonderful tool for us for free!
@@bifrostbeberast3246 glad you got it working! thanks :)
Hi, the way you transferred the animation didn't work for me, though I don't know why.. I created the action and put it on the control rig (which does work) but it doesn't move in sync with the driver rig, in fact it doesn't move at all. Would love any (total beginner friendly) tips. Thanks.
It's supposed to be transferred to another generated rigify rig, not the control rig. The idea is to have a face rigged with the generated rigify face rig and transfer the animation of the driver rig to it
@@cgtinker Yes, I get the idea. The driver rig has the animation that came from the video, which is then copied to the rig that controls the face, which is the control rig. (You gave it that name yourself in the video, hence I named it that way). So after copying the animation, the control rig and the driver rig should move in sync, yet my control rig doesn't move at all. Even though the animation data seems to be copied, as I can see it on the right side under "Animation".
@@nilslemke7542 weird, can you send me the blend file at hello@cgtinker.com?
I'll take a look. Probably something went wild. Preferably send it without any meshes (to reduce attachment size)
@@cgtinker I realised that I was just stupid and didn't see you clicking "Bake To Action" because there was a cut right before it and it went so fast. 😅 So after that, it worked.
Good night. It is not compatible with Android 11 on my Redmi Note 11 nor my Huawei P10 Lite on Android 8. Why? Please tell me
The app requires ar core support
very interested in this.. complete noob though (rigged like 1 thing b4) :( I got to 3:18 but my bones are just rotating in pose mode.. anyway, I will try again as there aren't many options for this kind of thing...
some bones can only be rotated, others can also be translated. make sure you are in pose mode, navigate to the armature tab and make sure you have activated the bone layers containing the ik bones. (sadly that's a step hard to describe without visuals, just google bone layers)
in pose mode you can transform the bones. the "red" bones from rigify are usually IK drivers - those can usually be moved around freely. Yellow bones are usually only for rotations. The green ones are driven, so you can basically ignore them.
The android version is telling me my phone is not compatible with this version of the software, i'm using a fairly recent Redmagic 6 pro, help!
ar core support is required. here is a list of ar core supported devices, I can't really do anything about that :/
developers.google.com/ar/devices
@@cgtinker heya thanks for the quick reply!
But what about when I want to use mocap data with it? Normally mocap data gives you a simple bone rig, and normally your character has a simple bone rig, since constraints make no sense when using mocap. However, I tried to retarget from the bone rig to the generated rigify rig, just to find a huge list of bones with no way to search, just scroll - pretty unusable.
The only way I think this could work is using a bone rig for the entire body and just the driver rig for the face. But how to combine both rigs in one character? Is it even possible to have 1 character parented to 2 rigs?
Yep, it's possible. You can check out the add-on 'Game Rig Tools' if you like. It's optimized for exactly this process. My add-on focuses only on the transfer to humanoid rigify rigs
@@cgtinker thx for the fast answer. Seems I overcomplicated the whole thing. (Getting some sleep and coming back to a problem with fresh eyes works wonders^^) Found a streamlined workflow to incorporate blendartrack + mocap fusion + metahumans (ok, getting metahumans into Blender and working is horrendous).
Your app integrates flawlessly with mocap fusion, runs on android + you don't have to define 42 shape keys in a character's face to get it working, unlike the iphone app.
Hi, I'm using a Xiaomi Mi 6 and I can't install it. The Play Store says it isn't compatible with my device... any idea why?
some older devices are not supported by ArCore, google for ArCore supported devices and you'll find a list
@@cgtinker ah that would explain it then - thanks!
Hi! Thanks for sharing your awesome work! I did it as the tutorial says, but when I have to generate a face rig for the animated empties, the face rig appears all mixed up! And I did change the names first! I don't really know what the problem can be, I tried everything, can you help me?
Did you select the device type or change any bone names?
@@cgtinker I did not change any bone name, which one is the device type?
oh!! it was iOS, not Android! I'm gonna try again. thanks!!
I can't get this app, I get "this app will not work on your device", no damn specifics
Most likely your device doesn't meet the system requirements. There are lists from Google and Apple for ArCore and ArKit supported devices respectively. ArCore or ArKit is required to run BlendArTrack
thank you so much for the tutorial. which app did you use to track the face animation, please?
blendartrack, check the links in the description ;)
sorry how do you save the driver rig so that I can import and use it in another project?
After transferring, just stash the animation in the action editor. Then you have an NLA strip you can easily work with in other projects.
@@cgtinker What I meant is to save the rig itself, so I don't have to reconstruct it every time I need to use it for a different animation.
@@1murkeybadmayn not really sure if I understand the question.
usually you rig a geometry, bind it, and use it as a character. If you use a rigify rig while doing so, you should be able to transfer the animation to the character's face using the add-on.
Animations on a rig can be stored on the character as NLA strips (to reuse them). The usual workflow to do so is baking the animation and stashing the result.
@@cgtinker No, I meant to reuse the rig itself, not the animation, because I don't want to have to rebuild it for something else, but never mind. I have a bigger issue. I did the weight painting thing but my neck is still deforming even after adding the support bones. I don't know what is going on. You also mentioned shift-clicking the bones when weight painting, what does this mean? Because I cannot click on the bones when weight painting.
@@cgtinker Does not matter. The weight painting did not work, and the NLA animation just disappears once you quit and reopen Blender anyway; it does not save. Also, after baking the rig and transferring the animation, it just broke my model, as half the face was separate from the model. The eyes did not bind to the bone and the head separated from the body of my model. I tried to follow every detail in the video; I was at this for a whole week and still ended up with nothing working. Overall, it was very hard to follow your video. The video is not in sync with the instructions and it was too fast even at half speed, and you skipped some bits when clicking, so I had no idea what you clicked to get certain things up. I think you should do the demonstration while talking, not record the audio and video separately. It would also help if it was slower and didn't skip parts, so we can follow along.