blendartrack - ar face tracking to rigify rig transfer

  • Published 11 Jul 2024
  • If you don't have blendartrack installed already, search for blendartrack in the app store to get the app or use the following links.
    iOS:
    apps.apple.com/us/app/blendar...
    Android App:
    play.google.com/store/apps/de...
    The add-on is available on GitHub (requires Blender 2.9+):
    github.com/cgtinker/blendartrack
    Want to chat or need help? Join me on Discord! :)
    / discord
    Want to support the project?
    / cgtinker
    Timecodes
    0:00 - Introduction
    0:31 - Align Rigify Super Face
    2:16 - Bind Mesh To Rig
    4:22 - Setup and Import Tracking Data
    5:51 - Transfer Animation Data
    Don't know blendartrack? Check out this video:
    • blendartrack - ar moti...
  • Film & Animation

COMMENTS • 226

  • @verticallucas
    @verticallucas 1 year ago +1

    I probably already left a comment at some point, but wanted to say this addon is phenomenal. I really hope you keep updating it, because it has such a streamlined workflow and is quite straightforward and effective.

  • @NanoDraws
    @NanoDraws 2 years ago +19

    Usually I don't write comments, but you are an incredible lifesaver (and mostly timesaver). I'm just getting into 3D modeling and animation, and as a beginner I really couldn't see myself animating faces for hours (probably for a poor result, too). It's really easy to use and the result is impressive. So sincerely, thank you for sharing this!
    And for anyone not understanding why the face rig does not move: it's not an issue (that's okay). The face rig is used to generate the driver rig, and the driver rig is the one that moves.
    (Clarifying so nobody else gets stuck over nothing like I did lmao)

    • @cgtinker
      @cgtinker  2 years ago

      thanks a lot, glad it enables you to move further :)
      keep on creating!

  • @F1dg3t
    @F1dg3t 2 years ago

    Thank you, you have no idea the wonders this is doing for my production.

  • @activemotionpictures
    @activemotionpictures 2 years ago +5

    This is impressive! So the empty cloud generates (snaps to) a close rigify facial bone. This is an amazing improvement! Thank you for sharing this!

    • @cgtinker
      @cgtinker  2 years ago

      glad you like it! here's what's happening:
      - The global rigify super.face and local bones get aligned to the empties (doesn't take parent scale into account yet)
      - The generated rigify rig drivers get constrained to the empties
      At first I planned to make any rig snap to the empties, but after some testing I realised that cannot work out. Many drivers are usually involved when creating face rigs, that's why I went for the rigify face rig approach :)
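
      A minimal Python sketch of the second point above (constraining a driver bone to a tracked empty); this is not the add-on's actual code, and the object/bone names are placeholders:

      import bpy

      # generated driver armature and one imported tracking empty (placeholder names)
      driver_rig = bpy.data.objects["driver_rig"]
      empty = bpy.data.objects["cgt_lip.T"]

      # constrain the driver pose bone so it follows the empty's animation
      pose_bone = driver_rig.pose.bones["lip.T"]
      con = pose_bone.constraints.new(type='COPY_LOCATION')
      con.target = empty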

  • @jkneifl15
    @jkneifl15 2 years ago

    this looks like a very smart way to go about rigging a face. I look forward to trying it out.
    Thank you for sharing.

  • @JadonLolley
    @JadonLolley 1 year ago +1

    this is great. how many times now have i watched? thanks dude!🙌

  • @ShaneDotz
    @ShaneDotz 2 years ago

    That’s a great workaround I never thought of!

  • @felixfokoua2184
    @felixfokoua2184 1 year ago

    DAMN thanks a lot. This is one of the best add-ons on Blender for animation

  • @Ochenter
    @Ochenter 2 years ago +2

    Lovely tutorial, Mister.
    Thanks.

  • @dzagri4407
    @dzagri4407 1 year ago

    This is so cool!
    You just made life a bit easier and thanks for that😀

  • @3dApe
    @3dApe 2 years ago

    Man, I'm definitely going to support you on patreon once I have a little more money

  • @user-sm1nm5tv1d
    @user-sm1nm5tv1d 2 years ago

    Thank you for your kind teaching.:)

  • @gert-janakerboom1314
    @gert-janakerboom1314 1 year ago

    amazing addon, thank you so much !

  • @MikuDanceAnima
    @MikuDanceAnima 2 years ago

    Thank you very much, this is great, definitely have to buy it.

  • @garywpearson1955
    @garywpearson1955 1 year ago

    I'm still having fun with your app! Working on more cartoons. Thanx again!

  • @walejaw3d
    @walejaw3d 1 year ago

    bro this video is perfection. Thank you

  • @aadityarai7027
    @aadityarai7027 7 months ago

    Thanks bro you are life saver❤❤❤

  • @eldarra562
    @eldarra562 1 year ago

    It's a good tutorial! Thanks man!

  • @DOCsHindi
    @DOCsHindi 11 months ago

    Thanks for making a great add-on, it works fine. And thank you again for making it free, because beginners like me can afford it. Thanks for everything

  • @KwasiAnimationStudio
    @KwasiAnimationStudio 2 years ago

    Definitely glad this popped up on my feed. I need facial tracking because I'm in the process of making a feature film project. ✊🏾

    • @cgtinker
      @cgtinker  2 years ago

      If you are just looking for face tracking, check out my mobile app blendartrack - it's better for facial animation at the moment

  • @lkgdmusic9634
    @lkgdmusic9634 2 years ago

    cheers mate!! What a legend!!

  • @AutodidactAnimotions
    @AutodidactAnimotions 2 years ago +1

    Thanks for this!!!

  • @alan112223
    @alan112223 2 years ago

    That is just amazing!

  • @PriestessOfDada
    @PriestessOfDada 2 years ago +4

    Thank you thank you thank you! I have been searching for how to do this particular thing for a month. It's going to save me so much time when making blendshapes. You have no idea. Thank you SO much

  • @willsculpts
    @willsculpts 1 year ago

    thanks man ❤❤❤❤😍😍😍😍

  • @santoshgujar5237
    @santoshgujar5237 2 years ago +1

    Thank you, Sir

  • @ewgross8630
    @ewgross8630 1 year ago

    thanks man

  • @joshuamedina5541
    @joshuamedina5541 2 years ago

    thank you so much

  • @fihruhafid
    @fihruhafid 1 year ago

    AMAZING!

  • @roborenderer
    @roborenderer 2 years ago

    very helpful thanks

  • @ivanmontiel7349
    @ivanmontiel7349 1 year ago

    Thanks!

  • @Ollie_sm
    @Ollie_sm 2 years ago +1

    awesome!

  • @user-tl1hh2fr1y
    @user-tl1hh2fr1y 1 year ago

    The teacher speaks very well.

  • @jamesmcandrew7860
    @jamesmcandrew7860 1 year ago +2

    Hey! I LOVE how this works and your tutorial! Unfortunately, the only bones that move after copying the baked action are the ones around the eyes. PLEASE HELP!

  • @ParOk_Art
    @ParOk_Art 1 month ago

    YYYY so cuuuuuuul

  • @HarryMcKenzieTV
    @HarryMcKenzieTV 2 years ago

    hallo! thanks for sharing! unfortunately the last link in your video description about blender track does not open. is there another video about how to use blender track?

  • @zUMERSAL
    @zUMERSAL 1 year ago +1

    thank you for the very informative video, do you know if deleting bones affects blendartrack?

  • @kellyjohnson768
    @kellyjohnson768 1 year ago +1

    cg tinker, do you have to generate the face rig by using that method you did with the single bone, or can you just use the normal rigify rig you can generate?

  • @dougrutledge532
    @dougrutledge532 11 months ago +1

    Is there a way that you could include audio recording as well? Even in a simple format?
    The reason why I ask is that I'm trying to animate a scene, and if the audio track and the animation track are synced, that'd be a time saver.

  • @AdrianParkinsonFilms
    @AdrianParkinsonFilms 2 years ago +8

    I just gave this a try and it works really well. I would recommend that after you've transferred the animation, you add another NLA strip and use that to further refine areas that might not track perfectly. So in the test I just did, the lips didn't go narrow enough with OH sounds, so I simply brought the corner controls in.
    Is there a way to set the frame rate of imported motion? I typically work at 24 FPS rather than 60.

    • @Cripthulu
      @Cripthulu 2 years ago +1

      You should be able to scale the keyframes in the timeline on the x-axis; you just have to find the appropriate amount to scale them down by. For example, if you worked at 30fps you would scale them down so that they were halved. This video: ua-cam.com/video/4LnGFtGjk2E/v-deo.html talks about it more specifically. I've timestamped it to make it easy for you to find :-) I hope this helps!
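
      As a rough illustration of the retiming idea (not from the add-on): scale every keyframe of the imported action on the time axis by target fps divided by source fps. The action name below is a placeholder.

      import bpy

      action = bpy.data.actions["face_capture"]   # imported mocap action (placeholder)
      factor = 24 / 60                            # target fps divided by source fps

      for fcurve in action.fcurves:
          for kp in fcurve.keyframe_points:
              kp.co.x *= factor                   # move the key in time
              kp.handle_left.x *= factor          # keep the Bezier handles consistent
              kp.handle_right.x *= factor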

    • @oyentemaniatico
      @oyentemaniatico 2 years ago

      @@Cripthulu very useful thank you

  • @CarlosFuentesok
    @CarlosFuentesok 2 years ago

    Can this be exported to Unity in FBX format? I managed to understand this video along with another of yours that covers the steps more quickly. I found that this does not stick to the character's bones, it only sticks to the mesh. So I'm asking if it can be exported for use in Unity

  • @mikeben8769
    @mikeben8769 1 year ago

    Great work! I want to know if this face animation could work with Readyplayerme avatars! Thanks.

  • @rohancooper4521
    @rohancooper4521 2 years ago +1

    Hello! Thanks for this amazing tool and all the work you're doing for the community! Will this mocap work with the full body rigify rig on the Human Generator v3 humans? I am able to follow along until the last step but the action doesn't seem to do anything on the Human Generator Rigify rig.

    • @cgtinker
      @cgtinker  2 years ago +1

      Hmm..maybe they use the new rigify face rig version. I plan to implement that in the future. Currently that's not supported.

  • @3dpprofessor
    @3dpprofessor 1 year ago

    Does this work in 3.3? When I try to import nothing happens. no error message, nothing.

  • @lincolnsmithtbp
    @lincolnsmithtbp 2 years ago +8

    I am a total beginner in character animation, mocap, and Blender. I found your tutorial here a little too fast to follow, but that's because I'm still new to so many of the basics. I've successfully made it work within Blender, but never transferring from the capture to a custom character. Where do you recommend getting started if I want to focus on learning this field of CG? (16-year VFX veteran of compositing and supervision, just trying to learn this) Thank you.

    • @cgtinker
      @cgtinker  2 years ago +4

      Hey Lincoln, I just noticed your comment - sorry for the late answer. How is it going?
      For learning rigging, I can recommend CGDive. I've started to make some new tutorials about rigging and plan to update blendartrack soon to make the facial animation transfer easier.
      Which fields are you interested in? There are lots of great resources around :)

  • @MrPhili10
    @MrPhili10 2 years ago +1

    Hey! Awesome content. How come I've never seen you before?? I really love your BlendArMocap project. I was actually thinking of writing my Bachelor's thesis about this exact topic and try my own project haha. Anyway, I wanted to ask if there's any possibility to get the project file from this video, since I would love to try some things out. I completely understand if this isn't possible :) Thank you!

    • @cgtinker
      @cgtinker  2 years ago

      Thanks! The blendartrack facial animation sample file? Gotta check if it's still in my docs. Let me know your mail / a way to send it to you if I find it.
      If you'd rather not share it here, reach me at hello(at)cgtinker.com

  • @bulba1995
    @bulba1995 2 years ago +1

    Thanks 👍

  • @chandandutta5740
    @chandandutta5740 1 year ago

    Hi, it's a very nice tutorial. I am using a MacBook Pro, and it does not have a NUM PAD; what would be the alternative option for me, avoiding the NUM PAD? Thanks.

  • @kellyvill9413
    @kellyvill9413 1 year ago

    Hi. Is this compatible with the Faceit plugin or the Autorig pro face?

  • @aion3232
    @aion3232 1 year ago +1

    Is it possible to make Blendartrack work with Humangenerator?

  • @MemeHeaven.0
    @MemeHeaven.0 1 year ago

    Can you do this with the new Snow v2 character from Blender that is fully rigged? I have tried, but it doesn't seem to be working at all???

  • @zebcode
    @zebcode 1 year ago

    Hey, I don't see Rigify buttons option. Only Rigify bone groups, Layer Names, and Generation. Thought the option might be under generation but it doesn't seem to be there?
    EDIT: I forgot to go into edit mode and delete the button but the option name seems to have also changed. It's now called Rigify Samples

  • @isaacwardmusic
    @isaacwardmusic 3 months ago +1

    I'm having an issue where after I do the whole process, only one of the bones in the character's face rig is moving. Do you have any idea why?

  • @rogoz8958
    @rogoz8958 5 months ago +1

    The app is not supported on newer Android devices on the Play Store. Is there a safe link to install it unofficially?

  • @AdrienLatapie
    @AdrienLatapie 1 year ago

    Thank you for this tutorial!! BTW, after baking the Action and assigning it to my rig, only the eyes move, do you know what could be the issue? D:

    • @cgtinker
      @cgtinker  1 year ago

      I think you tried transferring to a non-generated meta-rig. Make sure to generate it first

  • @Sjostrom2001
    @Sjostrom2001 10 months ago

    For some reason the camera does not switch to show the inner camera whenever i choose face tracking, it just shows the outer camera

  • @garywpearson1955
    @garywpearson1955 1 year ago

    WOW!!!

  • @rebelllion8853
    @rebelllion8853 1 year ago

    Hi! I tried with Blender 3.5 - Generate Rig doesn't work (Python error) - do you know of any issues with this version?

  • @eb-
    @eb- 1 year ago

    I'm having trouble at 3:18. My mesh is not moving with the control_rig. I tried it twice. Made the control_rig a parent of the mesh. Tried changing positions of the armature within the face. No luck. Help...

  • @manuarias20
    @manuarias20 2 years ago

    Thanks for the video and the info! I have a question: when I apply the action to another facial rig, only the eyes move; the other bones stay still

    • @cgtinker
      @cgtinker  2 years ago

      can u share the file with me?

    • @manuarias20
      @manuarias20 2 years ago

      @@cgtinker Thanks for the response. I already solved it! I was trying to apply it to the bones instead of the rig. Thank you again for the tutorial and your willingness to answer questions. Have a great day!

  • @zoobloo
    @zoobloo 3 months ago +1

    Can't find the app anywhere anymore ): Tried both iOS and Android devices

  • @naytbreeze
    @naytbreeze 2 years ago

    this may be a bit of a noob question, but how can we change the size of the animated face mesh and keep the animations? I need to size mine up in Blender, but the animations are set in keyframes when imported and there are a lot of them. Is there a way to change the size of the entire animation and keep everything else? Thanks

    • @cgtinker
      @cgtinker  2 years ago

      I think the easiest workflow is just to parent the mesh to an empty, then scale and transform the empty to your liking :)
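
      A quick sketch of that workaround (object names are placeholders, not the add-on's own naming): parent the animated mesh to an empty and scale the empty, so the imported keyframes stay untouched.

      import bpy

      mesh = bpy.data.objects["face_mesh"]              # animated face mesh (placeholder)
      empty = bpy.data.objects.new("scale_empty", None) # plain empty at the origin
      bpy.context.scene.collection.objects.link(empty)

      mesh.parent = empty                               # mesh now follows the empty
      empty.scale = (2.0, 2.0, 2.0)                     # resize without editing keyframes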

  • @macitseferi1843
    @macitseferi1843 2 years ago

    i have my own character and he already has rig on his face.. do i have to remove my own rig first ?

  • @maiers24
    @maiers24 2 years ago +1

    This is incredible. Are there any tips on what to do when you get an error after pressing generate facial rig?

    • @cgtinker
      @cgtinker  2 years ago

      Whats the error message?

  • @domenicomastandrea7981
    @domenicomastandrea7981 1 year ago

    Fantastic tutorial.
    Where can I find the Rigify feature set like the Super Face?

    • @cgtinker
      @cgtinker  1 year ago

      The list of available rig types appears in the Bone properties tab when the bone is selected in Pose Mode. Scroll down the Properties editor to find the Rigify Type panel.
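
      If you prefer scripting it, here is a hedged sketch of the same assignment (assumes Rigify is enabled; the metarig/bone names are placeholders, and 'faces.super_face' is the legacy face sample used in this video):

      import bpy

      metarig = bpy.data.objects["metarig"]
      bpy.context.view_layer.objects.active = metarig
      bpy.ops.object.mode_set(mode='POSE')

      # same as picking the rig type in the Rigify Type panel of the Bone properties
      metarig.pose.bones["face"].rigify_type = "faces.super_face"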

  • @legendearn8570
    @legendearn8570 1 year ago

    can you please make a tutorial on how to apply it to a Human Generator model.. thanks a lot for this amazing addon,
    would be thankful if there were body animation as well as facial..

  • @scrambles1230
    @scrambles1230 1 year ago

    Hey I tried setting this up and even after baking to action my mesh is not moving

  • @NamastayGangstaArt
    @NamastayGangstaArt 2 years ago +1

    Can this work with auto rig pro?

  • @doodledog5080
    @doodledog5080 10 months ago

    The jaw control fully moves the whole bottom of the neck as the jaw opens, I’m unsure how to fix it :(

  • @shivangipriya4153
    @shivangipriya4153 1 year ago

    Can you please show how you set up the blendartrack app on Android? I could not. Please help me

  • @KaasTVNL
    @KaasTVNL 2 years ago +1

    Thankyouuu! Will blink be supported in the future?

    • @cgtinker
      @cgtinker  2 years ago +2

      Currently it's just available for iOS. Guess in a couple of years it will be available for Android too :)

  • @sujeetop3780
    @sujeetop3780 2 years ago

    Hey man❤️
    I tried to install the dependencies, but it's not installing. How can I fix this?

  • @femisuccess124
    @femisuccess124 1 year ago

    Can I use the blendar plugin with meshes from a planar track, like a Mocha Pro track of the face converted to empties? Can I use that mesh track data to generate a rig in the blendar plugin, or does it only work with track data from the phone app?

    • @cgtinker
      @cgtinker  1 year ago

      it's only made for the phone app (didn't even know mocha pro till now tbh :P)

  • @Rocksetta
    @Rocksetta 2 years ago

    This is great stuff. I would like to make a video about it. With the new Blender and your newest version I am doing fine making the face rig empties, generating a face rig, using the action editor, and baking to the driver rig. I just seem to be lost with using the action editor to transfer the baked actions. Such a simple step, but I seem to be missing something. Any suggestions?

    • @cgtinker
      @cgtinker  2 years ago +1

      to transfer the action you need to use a generated rigify face rig. on the generated rigify face rig you can just select the action in the action editor (actions are stored in the blend file and don't depend on a certain object).
      Hope that helps :)
      Planning to make another video about this topic as I have a small usability update in mind
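
      A minimal sketch of that transfer step in Python (names are placeholders): assigning the baked action to the generated Rigify rig's animation data is the scripted equivalent of picking it in the Action Editor.

      import bpy

      rig = bpy.data.objects["rig"]                 # generated Rigify face rig (placeholder)
      baked = bpy.data.actions["driver_rig_action"] # action baked from the driver rig (placeholder)

      if rig.animation_data is None:
          rig.animation_data_create()
      rig.animation_data.action = baked             # same as selecting it in the Action Editor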

    • @Rocksetta
      @Rocksetta 2 years ago

      @@cgtinker I got it working. If I concentrate on full driver--> armature rig-->mesh I can get things working. When I just try to move the action from the driver to just the armature rig I can't do it. Anyway all good now. I will link to you on twitter if I get a video going.

  • @antoinecruse7399
    @antoinecruse7399 2 years ago +1

    Hey!! great tutorial and plug-in! I'm having a bit of trouble when I get to the generate driver rig step. When I have my empty with the animated gesture and then I align the face rig to it and hit generate driver rig, it makes something all scrambled. Is there any troubleshooting for this, or has this issue been run into before? I would really appreciate help with a solution to this issue!!

    • @girlxsenpaigaming263
      @girlxsenpaigaming263 2 years ago

      I'm having the same issue as well; I have followed every detail of the tutorial but things go crazy when I generate the driver rig.

    • @cgtinker
      @cgtinker  2 years ago +1

      it's important to make sure that the scale of the driver rig & the rig which should get animated is 1.
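
      A small sketch of that fix (object names are placeholders): apply scale on both armatures so their object scale is exactly 1 before generating or baking.

      import bpy

      bpy.ops.object.select_all(action='DESELECT')
      for name in ("driver_rig", "rig"):            # driver rig and the rig to be animated
          obj = bpy.data.objects[name]
          obj.select_set(True)
          bpy.context.view_layer.objects.active = obj

      # bakes the object scale into the data so both objects end up at scale 1
      bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)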

    • @tabby842
      @tabby842 2 years ago

      @@girlxsenpaigaming263 I can't even generate a face rig, it seems like in the video the parent locator he's talking about is called face_Motion_, but in mine the parent locator that moves all the empties is called 'cgt_HeadController'. There is a locator called Face_Motion_ but it doesn't seem to be affecting anything after you apply the translation and rotation. The capitalization is also different than the video which leads me to believe that he updated the code since then and there might be a bug? Even if I try to scale the generated face rig it stays scrambled.

    • @ItsKoye
      @ItsKoye 2 years ago +1

      Hello I was having the same issue! Then I realized I needed to change my input device to IOS and it worked!

  • @michaeltyers7336
    @michaeltyers7336 1 year ago

    Can we use this technique if the target face is non-humanoid? Like a talking horse or a dragon?

    • @cgtinker
      @cgtinker  1 year ago +1

      well, kinda, but you gotta create a rig for that and probably calculate some data to get there. I don't think it's possible to automate something like this; shape keys are probably the closest, but still, it requires user effort

  • @agnoise
    @agnoise 2 years ago

    this means it only works with faces_super_face armature? Can't I transfer the animation to a custom armature?

    • @cgtinker
      @cgtinker  2 years ago

      it's supposed to be used with rigify (the metarig also uses the 'super_face')

  • @dwassortedmedia
    @dwassortedmedia 2 years ago +3

    I seem to be having trouble with getting my action to apply to the rigify face rig. I am able to bake the action without error but when I select the rigify armature and click the action from the dropdown nothing seems to be transferred :/

    • @cgtinker
      @cgtinker  2 years ago +1

      Are you sure that you selected the same action? Did you check if the driver rig contains keyframes? Is the rigify rig you want to transfer to "generated" (it doesn't work with just the meta rig)?

    • @penniesonoureyes3436
      @penniesonoureyes3436 2 years ago

      I had the same problem - I'd forgotten to do a Rigify Generate Rig on my destination face (at 02:30 in the video)

    • @famitory
      @famitory 2 years ago

      i thought i had this problem, but what was actually happening is that because i had to scale the transfer back down to scale 1 to get the bake to not mangle, the resulting movements were invisible on the larger-scale rigged model

  • @luolin3289
    @luolin3289 2 years ago

    hi! there's absolutely nothing under my rigify button. I enabled the plugin though. I cannot find the super face

    • @mrcolz9373
      @mrcolz9373 2 years ago

      You have to be in edit mode

  • @1murkeybadmayn
    @1murkeybadmayn 2 years ago

    Hi there. Is there a way to just use the empties as a driver to an already rigged model so that we don't have to make the bones from scratch?

    • @cgtinker
      @cgtinker  2 years ago +2

      working on another tool which will have drivers. but it isn't ready yet.

    • @1murkeybadmayn
      @1murkeybadmayn 2 years ago

      @@cgtinker Oh that would be great. Because I have a model which is already rigged with bones and cost me a lot too lol. Just a shame I cannot just copy the driver directly to it.

    • @mrcolz9373
      @mrcolz9373 2 years ago

      @@cgtinker Any progress on this? I am in a similar predicament where I have a rigged model, and the rigify bit is possible but not efficient in terms of workflow. But great job on this addon, it's a life saver for short film makers.

  • @SHA3DOW_
    @SHA3DOW_ 2 years ago

    Unfortunately not compatible with my device.

  • @girlxsenpaigaming263
    @girlxsenpaigaming263 2 years ago

    When I generate the driver rig and then transfer (bake) the data to an action, after selecting the action on my main mesh it goes all weird, like the bones and joints go a bit crazy. Sorry, hope that makes sense, but I'm wondering if I'm doing something wrong? I'm a total noob so please bear with me 😪

    • @cgtinker
      @cgtinker  2 years ago

      possibly you scaled up either of the rigs and didn't apply the scale?

  • @jamesdickerson6726
    @jamesdickerson6726 2 years ago

    That's fine and all, but how would you do it in real-time? It's not augmented reality if it's not in real-time. Motion capture and motion tracking are not the same thing.

  • @Japleen_Couture
    @Japleen_Couture 1 year ago +1

    All the add-ons and stuff used are free?

    • @cgtinker
      @cgtinker  1 year ago +1

      yes, I only run patreon atm

  • @famitory
    @famitory 2 years ago

    any tips for transferring onto a rig that's not really very human-shaped?

    • @cgtinker
      @cgtinker  2 years ago

      Try blendarmocap, it's better suited for more cartoonish characters. Will soon make a video on how to manipulate results

  • @BOSSposes
    @BOSSposes 2 years ago

    how do i import face data onto an existing model as this method did not work in my case

    • @BOSSposes
      @BOSSposes 1 year ago

      Day 4: the rig only generates 1 eye and vanishes. These mocap YouTube tutorials are soooooooo bad, none of them are useful

  • @hingakoroma2071
    @hingakoroma2071 2 years ago +1

    Hi I couldn’t get blendartrack to go selfie mode so I can’t track my face

    • @cgtinker
      @cgtinker  2 years ago

      are you using iOS or android?
      iOS X+ is required for face tracking

  • @vishvaspancholi5362
    @vishvaspancholi5362 2 years ago +1

    The character I have has Quaternion Rotation vectors, is there a way I can convert the animation data from Euler to Quaternion?

    • @cgtinker
      @cgtinker  2 years ago +1

      caution: the transfer only works to rigify face rigs.
      1. enable rigify, select the rig and go in pose mode.
      2. press n, in the rigify panel you can easily convert actions from euler to quaternion.
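
      For reference, a rough manual equivalent of that panel conversion (the Rigify panel is the easier route): re-key one bone's Euler rotation as quaternions. Bone/object names are placeholders and edge cases such as Bezier handles are ignored.

      import bpy

      rig = bpy.data.objects["rig"]
      action = rig.animation_data.action
      bone_name = "jaw_master"                      # placeholder bone name
      pbone = rig.pose.bones[bone_name]

      # frames that carry Euler keys for this bone
      path = f'pose.bones["{bone_name}"].rotation_euler'
      frames = sorted({kp.co.x for fc in action.fcurves
                       if fc.data_path == path for kp in fc.keyframe_points})

      # sample the animated Euler rotation per keyframe and convert it
      keys = []
      for f in frames:
          bpy.context.scene.frame_set(int(f))
          keys.append((f, pbone.rotation_euler.to_quaternion()))

      # switch the bone to quaternions and write the converted keys
      pbone.rotation_mode = 'QUATERNION'
      for f, quat in keys:
          pbone.rotation_quaternion = quat
          pbone.keyframe_insert('rotation_quaternion', frame=f)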

    • @vishvaspancholi5362
      @vishvaspancholi5362 2 years ago +2

      @@cgtinker It worked! Thanks!

    • @vishvaspancholi5362
      @vishvaspancholi5362 2 years ago +1

      @@cgtinker Also, just a suggestion, I don't know if it's possible or not but if you use mediapipe for creating face empties then it'd be very easy to use the add-on because mediapipe works with any webcam plus it can also track body and hands as well.

    • @cgtinker
      @cgtinker  2 years ago +1

      ​@@vishvaspancholi5362 Thanks, I'll look into it in the future. It seems promising!
      At the moment I'd like to do a simulation tool first and take a lil break from ar - but I'll stay updated

    • @cgtinker
      @cgtinker  2 years ago +2

      @@vishvaspancholi5362 made some tests with mediapipe, damn it's fun! maybe it's going to be the next project lol.

  • @user-tc1et8rm8f
    @user-tc1et8rm8f 1 year ago

    I can't understand how to transfer the animation data to bones, like you have at the beginning of the video. I need to export this bone animation to Unreal Engine, but with control rig animation I can't... And I tried 100 times to attach the control rig to the bones and it never worked for me, even in different Blender versions

    • @cgtinker
      @cgtinker  1 year ago +1

      made the process easier in v2.2.0 ;)

    • @user-tc1et8rm8f
      @user-tc1et8rm8f 1 year ago

      @@cgtinker thanks, your work is insane

  • @bifrostbeberast3246
    @bifrostbeberast3246 2 years ago

    I guess you used global coordinates for the empties rather than local? I have weird stretching bugs depending on the orientation and direction of the generated control rig

    • @cgtinker
      @cgtinker  2 years ago

      the add-on is meant to be used in a specific way and currently is not a one-click solution. I'm considering making the process easier soon and updating the tutorial.
      In this tutorial, the empties are in the local space of the parent object, which gets scaled. after generating the rig it's important to apply the scale, otherwise weird deformations occur on the generated rig in step 2.

    • @bifrostbeberast3246
      @bifrostbeberast3246 2 years ago

      @@cgtinker Thank you for getting back to me. After tinkering a while, I realized that the error was totally on my side. Your add-on is amazing!
      Do shape keys help synchronize the many data points on the face, for example the eyelids? I get some weird deformations of the eyelids, as the facial recognition does not perfectly capture blinking and similar micro-movements.
      But judging the add-on as a whole, you sir, have done an amazing job. Thank you so much for providing this wonderful tool for us for free!

    • @cgtinker
      @cgtinker  2 years ago

      @@bifrostbeberast3246 glad you got it working! thanks :)

  • @nilslemke7542
    @nilslemke7542 2 years ago

    Hi, the way you transferred the animation didn't work for me, though I don't know why.. I created the action and put it on the control rig (which does work) but it doesn't move in sync with the driver rig, in fact it doesn't move at all. Would love any (total beginner friendly) tips. Thanks.

    • @cgtinker
      @cgtinker  2 years ago

      It's supposed to be transferred to another generated rigify rig, not the control rig. The idea is to have a face rigged with the generated rigify face rig and transfer the animation of the driver rig to it

    • @nilslemke7542
      @nilslemke7542 2 years ago

      @@cgtinker Yes, I get the idea. The driver rig has the animation that came from the video, which is then copied to the rig that controls the face, which is the control rig. (You gave it that name yourself in the video, hence I named it that way). So after copying the animation, the control rig and the driver rig should move in sync, yet my control rig doesn't move at all. Even though the animation data seems to be copied, as I can see it on the right side under "Animation".

    • @cgtinker
      @cgtinker  2 years ago

      @@nilslemke7542 weird, can you send me the blend file at hello@cgtinker.com?
      I'll take a look. Probably something went wild. Preferably send it without any meshes (to reduce attachment size)

    • @NiLem98
      @NiLem98 2 years ago

      @@cgtinker I realised that I was just stupid and didn't see you clicking "Bake To Action" because there was a cut right before it and it went so fast. 😅 So after that, it worked.

  • @yvesjannic6695
    @yvesjannic6695 2 years ago

    Good night. It is not compatible with Android 11 on my Redmi Note 11 nor my Huawei P10 Lite on Android 8, why? Please tell me

    • @cgtinker
      @cgtinker  2 years ago

      The app requires ARCore support

  • @SuperRockcore
    @SuperRockcore 2 years ago

    very interested in this.. complete noob though (rigged like 1 thing b4) :( I got to 3:18 but my bones are just rotating in pose mode.. anyway, I will try again as there aren't many options for this kind of thing...

    • @cgtinker
      @cgtinker  2 years ago

      some bones can only be rotated, others can also be translated. make sure you are in pose mode, navigate to the armature tab and make sure you have activated the bone layers containing the IK bones. (sadly that's a step hard to describe without visuals, just google bone layers)
      in pose mode you can transform the bones. the "red" bones from rigify are usually IK drivers - those usually can be moved around freely. Yellow bones are usually only for rotations. The green ones are getting driven, so you can basically ignore them.
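
      A hedged sketch of the bone-layer step for Blender 2.9x/3.x (the versions this add-on targets): enable the armature layers that hold the control bones so they become selectable in Pose Mode. The layer indices and rig name are examples, not fixed values.

      import bpy

      rig = bpy.data.objects["rig"]                 # generated Rigify rig (placeholder)
      bpy.context.view_layer.objects.active = rig
      bpy.ops.object.mode_set(mode='POSE')

      for layer_index in (0, 1, 2):                 # example layers containing face controls
          rig.data.layers[layer_index] = True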

  • @lt.facepalm9566
    @lt.facepalm9566 2 years ago

    The android version is telling me my phone is not compatible with this version of the software, i'm using a fairly recent Redmagic 6 pro, help!

    • @cgtinker
      @cgtinker  2 years ago +1

      ARCore support is required; here is a list of ARCore-supported devices, I can't really do anything about that :/
      developers.google.com/ar/devices

    • @lt.facepalm9566
      @lt.facepalm9566 2 years ago

      @@cgtinker heya thanks for the quick reply!

  • @Frigus3D-Art
    @Frigus3D-Art 2 years ago

    But what about when I want to use mocap data with it? Normally mocap data gives you a simple bone rig, and normally your character has a simple bone rig, since constraints make no sense when using mocap. However, I tried to retarget from the bone rig to the generated rigify rig, just to find a huge list of bones with no way to search, just scroll - pretty unusable.
    The only way I think this could work is using a bone rig for the entire body and just the driver rig for the face. But how to combine both rigs in one character? Is it even possible to have 1 character parented to 2 rigs?

    • @cgtinker
      @cgtinker  2 years ago +1

      Yep, it's possible. You can check out the add-on 'Game Rig Tools' if you like. It's optimized for exactly this process. My add-on focuses only on the transfer to humanoid rigify rigs

    • @Frigus3D-Art
      @Frigus3D-Art 2 years ago

      @@cgtinker thx for the fast answer. Seems I overcomplicated the whole thing. (Getting some sleep and coming back to a problem with fresh eyes works wonders^^) Found a streamlined workflow to incorporate blendartrack + Mocap Fusion + MetaHumans (ok, getting MetaHumans into Blender and working is horrendous).
      Your app integrates flawlessly with Mocap Fusion, runs on Android + you don't have to define 42 shape keys in a character's face to get it working. Unlike the iPhone app.

  • @ollied2025
    @ollied2025 1 year ago

    Hi, i'm using a xiaomi mi6 and i can't install. play store says it isn't compatible with my device... any idea why?

    • @cgtinker
      @cgtinker  1 year ago

      some older devices are not supported by ArCore, google for ArCore supported devices and you'll find a list

    • @ollied2025
      @ollied2025 1 year ago

      @@cgtinker ah that would explain it then - thanks!

  • @benjifelicce
    @benjifelicce 2 years ago

    Hi! Thanks for sharing your awesome work! I did it as the tutorial says, but when I have to generate a face rig for the animated empties, the face rig appears all mixed up! And I did change the names first! I don't really know what the problem could be, I tried everything, can you help me?

    • @cgtinker
      @cgtinker  2 years ago

      Did you select the device type or change any bone names?

    • @benjifelicce
      @benjifelicce 2 years ago

      @@cgtinker I did not change any bone names; which one is the device type?

    • @benjifelicce
      @benjifelicce 2 years ago

      oh!!, it was IOS not android!, i'm gonna try again. thanks!!

  • @llYuki0okami
    @llYuki0okami 1 year ago

    I can't get this app, I get "this app will not work on your device", no damn specifics

    • @cgtinker
      @cgtinker  1 year ago +1

      Most likely your device doesn't meet the system requirements. There are lists from Google and Apple for ArCore and ArKit supported devices respectively. ArCore or ArKit is required to run BlendArTrack

  • @vfxlander
    @vfxlander 2 years ago

    thank you so much for the tutorial, which app did you use to track the face animation please?

    • @cgtinker
      @cgtinker  2 years ago +1

      blendartrack, check the links in the description ;)

  • @1murkeybadmayn
    @1murkeybadmayn 2 years ago +1

    sorry how do you save the driver rig so that I can import and use it in another project?

    • @cgtinker
      @cgtinker  2 years ago +1

      After transferring, just stash the animation in the action editor. Then you have an NLA strip you can easily work with in other projects.
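
      A short sketch of that stash step (placeholder names): push the baked action onto an NLA track so it stays attached to the rig and can be reused or linked into other projects.

      import bpy

      rig = bpy.data.objects["rig"]
      action = bpy.data.actions["driver_rig_action"]

      ad = rig.animation_data or rig.animation_data_create()
      track = ad.nla_tracks.new()
      track.name = "face_capture"
      track.strips.new(action.name, int(action.frame_range[0]), action)
      ad.action = None                              # clear the active action; the strip keeps it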

    • @1murkeybadmayn
      @1murkeybadmayn 2 years ago

      @@cgtinker What I meant is to save the rig itself so I don't have to reconstruct it every time I need to use it for a different animation.

    • @cgtinker
      @cgtinker  2 years ago

      @@1murkeybadmayn not really sure if I understand the question.
      usually you rig a geometry, bind it, and use it as a character. If you use a rigify rig while doing so, you should be able to transfer the animation to the character's face using the add-on.
      Animations on a rig can be stored on the character as NLA strips (to reuse them). The usual workflow to do so is baking the animation and stashing the result.

    • @1murkeybadmayn
      @1murkeybadmayn 2 роки тому

      @@cgtinker No, I meant to reuse the rig itself, not the animation, because I don't want to have to rebuild it for something else, but never mind. I have a bigger issue: I did the weight painting thing but my neck is still deforming even after adding the support bones. I don't know what is going on. You also mentioned shift-clicking the bones when weight painting; what does this mean, because I cannot click on the bones when weight painting.

    • @1murkeybadmayn
      @1murkeybadmayn 2 роки тому

      @@cgtinker It doesn't matter. The weight painting did not work, and the NLA animation just disappears once you quit and reopen Blender anyway, it does not save. Also, after baking the rig and transferring the animation, it just broke my model: half the face was separate from the model, the eyes did not bind to the bone, and the head separated from the body of my model. I tried to follow every detail in the video; very frustrating, for a whole week I was at this and still ended up with nothing working. Overall, it was very hard to follow your video. The video is not in sync with the instructions and it was too fast even when I played it at half speed, and you skipped some bits when you were clicking, so I had no idea what you clicked to get certain things up. I think you should do the demonstration while talking, not record the audio and video separately. It would also help if it was slower and didn't skip parts of what you're doing so we can follow it.