I love how people like you are making mocap available for everyone.
Thank you!
This runs so light on my PC, and it's not even a powerful one. I think you and Blender should work together and ship this as a standard addon; both of you would benefit a lot!
Agree... @cgtinker is a game changer
How do I download it in Blender, bro? Please tell me as soon as possible.
My words can't describe how much respect I have for this man
Agree, this man is so generous.
Somehow the YouTube algorithm put your video in my feed, and I can't complain, what a good choice. Thank you for making your content and your work available for everyone, keep it up!!
This is a real game changer for the Blender community. Thank you for this great app!
Omg I just thought about something like that before I got to sleep yesterday.
I was like "man it would be awesome if I could just stand in front of a camera and make a walk animation inside of blender myself"
AND HERE IT IS
Totally going to test that out, thank you!
CG, thank you so much, this is a real gift to the community. If I can make it work in iClone I will absolutely donate, and I hope others do, too.
Fascinating dude, following this project very closely ❤️
Since the empties are not parented, there is no hierarchy. Is it possible to parent the empties to each other, for example the hand parented to the forearm, the forearm parented to the arm, and the arm parented to the shoulder, and so on? How would I go about doing that?
Their positions are all in global space, so not really. The best approach would be to write a script.
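A minimal bpy sketch of such a script, assuming hypothetical empty names (the actual names depend on what the addon generates in your scene); it parents each empty while preserving its world-space transform, since the tracking data lives in global space:

```python
import bpy

# Hypothetical parent -> child pairs; replace with the empty names
# the addon actually created in your scene.
HIERARCHY = [
    ("cgt_shoulder", "cgt_upper_arm"),
    ("cgt_upper_arm", "cgt_forearm"),
    ("cgt_forearm", "cgt_hand"),
]

for parent_name, child_name in HIERARCHY:
    parent = bpy.data.objects.get(parent_name)
    child = bpy.data.objects.get(child_name)
    if parent is None or child is None:
        continue  # skip pairs that aren't present
    # Parent without moving the child: the inverse parent matrix
    # compensates so the world-space position stays intact.
    child.parent = parent
    child.matrix_parent_inverse = parent.matrix_world.inverted()
```

Run it from Blender's scripting workspace; it only changes object parenting, the keyframed animation on the empties is untouched.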
@@cgtinker I found an addon on the Blender market called Empties to Bones, which can convert empties to bones; those can then be used with Auto-Rig Pro's Remap tool to transfer bone animations. However, for that addon to work, the empties must be parented using the hierarchy described above. Since I don't know how to write a script, I'm guessing it won't work for me. I was hoping to find a way to retarget my animation onto Mixamo or other rigs besides Rigify.
@@recreatingcontents5795 you know what, I like the idea of an empty hierarchy and a ground-truth rig. I'll implement that.
@@cgtinker I'm excited to see how it works, as well as the Holistic (Experimental) and Leg animation transfer (Experimental) with foot locking. I appreciate how people like you are making mocap accessible to everyone. It is not an easy task, so keep up the good work. Thank you for your hard work in making this happen!
This is just what I need. Thinking of doing some free educational health videos
Brother, is this legit? I'm mind-blown. You should be up in the sky, man; you clearly don't get the recognition you deserve.
This is amazing. Blender continues to blow my mind. Thanks for this!!
Keep it up man this is great!
This is fantastic! I tried to do this a year ago, but just don't have the skills. Thank you so much!
Thanks! Actually using the data on a rig is quite tough! It took me way longer than initially expected.
Fantastic !
Is this rig usable as a VRM for live tracking?
I want to thank you so very much. I have just installed it, but I truly believe that you are going to save me a lot of time in mocap and animation. I love the face detection and all the rest too. Full body is unbelievable. Thanks again!!!
CG, Thank you very much! You are such a patient and kind guy!!!
Thank you, man. I tried AI mocap services, but they are so expensive. I needed something like this very badly. I'm glad you're doing this.
I really admire you; your work has saved many enthusiasts like me. I will support you with a cup of coffee. I hope you develop it further, like adding sound for the mouth joint. I will wait and buy from you. Wishing you good health, I love it, thank you.
thanks a lot!
I'll keep at it - I already have some updates in mind :)
Wow, this is amazing! Thank you for this addon, it's a huge timesaver for indie game development!
Hey there, I'm having a problem, sir. I can't get mediapipe to show "true" in the dependencies.
Wow, absolutely awesome. Greetings from Berlin, and thanks!
This looks great!
Man, I just joined this channel and became your fan in 5 minutes. You are the best!
[02:58] Making the metarig invisible and the rig visible confused me a bit^^
I kept wondering why it wasn't moving along like in the video.
Great stuff!
Good feedback, thanks :)
I'll move it out of the way in the future.
Thank you for making this mocap addon.
And sorry for downloading it for free;
I'm in the learning phase and don't have money to support you.
Thanks, buddy, for this.
I hope your channel grows.
thanks, hope you have a good learning experience within the Blender community :)
Thank you so much for your effort. The work you are doing really makes people's lives easier.
This is great! Please keep going!
Hey man, really like your video! Found out about MediaPipe yesterday and today about your work.
I've been developing a Blender addon (solely face mocap) for myself over the past few weeks too.
It uses some heavy maths with projections from 3D to 2D and vice versa, so I guess it's a lot easier this way, both implementation- and maintenance-wise. I'm excited to compare it with your work after I'm done switching to MediaPipe too!
Keep up the good work man, it's motivating to have others around with similar hobbies, hehe :)
thanks! keep it up ;p
Using MediaPipe is not too easy, as all points are in global space, so it's still a lot of work. The facial animation part therefore seems very similar to 2D - when you check it out you'll see.. :D
Mind sharing your git?
@@cgtinker You're probably right, I shouldn't assume anything before actually facing it :P
As for my Repo, I didn't create one yet... But I think it's a good time to set up one. I'll share
it with you in the future and will be looking forward to feedback!
A very big help for us. I did everything as in the lesson, even though I'm a complete beginner in Blender; I simply had a great desire to bring the model to life. I have only one question: is it possible to make the model repeat the movements in real time instead of using Transfer? So rather than recording the movements first, the model would mirror them immediately in the viewport. I'm really looking forward to this possibility and will be ready to thank and support the project with money.
Yes, but it needs a fast machine!
If you record for just a second and transfer directly, the character gets linked to the tracking results. If you then start recording again, you will see realtime results on the character.
update again! thank you! you're amazing!
absolutely wonderful stuff youve made. thank you!
what a great job worth millions
I guess it only works with Rigify. Couldn't the scanned points be transferred as tracks, so we can connect those tracks to the bones of our own armature? Also, can't the full body (body + face + hands) be captured at once? Maybe that's possible with a triple-camera system. But even now, being able to do this in real time is exciting. Thank you for offering such a plugin for free.
Merci! Really, thanks from France for your help and your work, you're amazing.
Man, it's so crazy! It's a really important addon for animation! Thank you!
Downloaded everything, it works.
I just tested the addon. It's perfect. It's great..
Thanks for this! I want to write the motion capture code myself in the near future (I am studying it now), but until then I decided to try using addons as well.
I'm in the Blender rabbit hole for real now.
Amazing work! You're fantastic.
I wish you could use non-Rigify bones with this.
Hi, when installing dependencies it shows the error "Installation of mediapipe failed. Check system console output." Can anyone help me? Thanks.
How would I bake the motion from drivers to keyframes?
this is awesome!! Great video!
Quick question: could we use hand gestures to drive animation? E.g. if I point my index finger up (a "1" pose), can I scale an object?
Haha, funny idea. Well, obviously it isn't made for that, but I guess you kind of can. You'd have to set up drivers, but that shouldn't be too hard.
For this case, a simple example: create a cube, add a driver to its scale, and use the x-rotation value of the "index mcp driver" to drive the scale.
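A rough bpy sketch of that driver setup; the driver object's name ("index_finger_mcp" below) is a guess and depends on the addon's actual naming:

```python
import bpy

# Create the cube that will react to the gesture.
bpy.ops.mesh.primitive_cube_add()
cube = bpy.context.active_object

# driver_add("scale") without an index returns one F-Curve per channel.
for fcurve in cube.driver_add("scale"):
    drv = fcurve.driver
    drv.type = 'AVERAGE'  # output = average of the variables (here: just one)
    var = drv.variables.new()
    var.name = "rot"
    var.type = 'TRANSFORMS'
    # Hypothetical name of the finger driver empty created by the addon.
    var.targets[0].id = bpy.data.objects["index_finger_mcp"]
    var.targets[0].transform_type = 'ROT_X'
    var.targets[0].transform_space = 'WORLD_SPACE'
```

The rotation is read in radians, so for a larger scale response you could switch the driver type to 'SCRIPTED' and use an expression like `rot * 3`.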
Very exciting !
Can't get it to work, unfortunately. I've tried on Windows and Mac, on 3.0 and 3.1, running with elevated privileges on both; it says "...returned non-zero exit status 1". Seems like an awesome addon though, thank you for your hard work on it.
Bummer, what's your operating system? I think macOS 10.15+ is required. If you are using an Apple Silicon Mac, make sure to download the Intel version of Blender; the Apple Silicon build is not supported.
subscribed, it's an awesome sounding addon!
Wow, this is amazing! Thanks for this work
Thank you sir for this amazing add on
Maaaan, thank you a lot for your content
It is recommended that multiple channels can be input at the same time and can be captured together. The face is not supposed to consider using the blend shape key.
what do you mean?
@@cgtinker translation problem.
1 If multiple cameras can record and analyze at the same time, it is convenient for the characters to perform without repeatedly recording hand movements and body movements.
2 I have watched a lot of ai expression tracking, in fact, Ziva also drives blender's shape keys by analyzing the facial shape, so that some factors of unstable tracking can be supplemented by shape keys
@@cgtinker Of course, I prefer to be able to perform accurate tracking later, so the effect should be better than the effect of real-time tracking
@@dayspring4709 thanks for the feedback!
Such a life saver 😋
Respect!
Looks nice! May I ask, do you need an internet connection after all dependencies have been downloaded, or is it totally internet-dependent?
Once the dependencies are installed you can turn off the internet. It runs locally.
THANK YOU!!! you're AMAZING!
Does this work with Blender 4? I'm having problems installing mediapipe.
Great tutorial, very well explained. Thank you. I'm trying to use the add-on on an animation made with the rokoko live plug-in and a mixamo rig. However, when I try to transfer the detection, it doesn't work. It places the animation to the side of my figure. Any help would be greatly appreciated
I guess I don't fully understand.
What do you mean by "the side of your figure"? Also, by default it only transfers motion to generated humanoid Rigify rigs.
@cg tinker OK, I understand now; my rig is not a Rigify rig, it's a Mixamo one.
great work brother
Great App mate!
OmegaWOW
RESPECT, BRO
Very good sir keep creating
This is amazing, but would you mind doing a video where you connect the armature to a human mesh?
I'm modelling a char right now, will prolly take it for the vid! :)
Thank you for this update. I noticed in the video the legs did not move after you did the pose transfer; may I ask why? Thanks again, really appreciate it.
well uhm I didn't press the "experimental leg transfer button".. :P
@@cgtinker oh ok thanks.
Very nice, but where are the installation links for mediapipe and the rest? :(
The only limitation I see for this addon is the webcam. I have an iMac and my room is not big enough to capture my full body, so it would be very difficult to position the 27-inch iMac properly. I wish you'd add a pre-recorded video option so we can shoot motions with our phones from anywhere 🙏💫
Great effort, thanks for sharing 😊
Which model are you using inside MediaPipe for pose tracking? I find BlazePose the most stable one.
Is there any way that you could do a video on how to transfer the motion captured data to a Daz3d character, such as exporting a bvh file or is it possible to export an fbx file that is "mixamo compliant", so that it can be used on a Daz3d character? Thanks in advance
Thank you. Could you make more videos on full-body capture, specifically for a person walking?
That's awesome. May I ask though why it's only working with the old face rig? Are there benefits to the old rig or do you plan to support the new face rig as well? Thanks again!
I plan to support it in the future. The main reason is that I usually develop in Blender v2.9 and port from there to other versions.
lots of love lots of respect
You're a godsend and a good person, thanks for helping me simplify my workflow :3 A hug.
It is a really awesome addon... I was thinking, what if you made it import a video file from the PC and track from it with high precision? It would be good for pose tracks and the legs, because we could record in a different environment and use the animation with other footage. Anyway, thanks a lot 😊
Honestly, I've been trying to figure out how to do this. It seems like most of the code is already there in this addon; you'd just have to input a video file instead of webcam footage. Theoretically, it shouldn't be that hard to implement. I'll look into it when I have time this week.
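For what it's worth, OpenCV treats a file path almost identically to a webcam index, which is why the change seems small; a sketch of the idea (the file name is just an example, and the MediaPipe call is left as a placeholder):

```python
import cv2

# A webcam uses a device index; a pre-recorded clip uses a path.
# cap = cv2.VideoCapture(0)            # live webcam
cap = cv2.VideoCapture("capture.mp4")  # example file path

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break  # a file ends, unlike a live stream
    # ...hand `frame` to the MediaPipe solution here,
    # exactly as you would with a webcam frame...

cap.release()
```

The main practical difference is pacing: a file decodes as fast as `read()` is called, so keyframe timing has to come from the clip's FPS (`cap.get(cv2.CAP_PROP_FPS)`) rather than from wall-clock time.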
I've been trying to install it but I can't get mediapipe 0.8.10 installed on my m1 mac. It says "ERROR: Could not find a version that satisfies the requirement mediapipe>=0.8.10 (from versions: none)
ERROR: No matching distribution found for mediapipe>=0.8.10". I see some people on github have this issue as well. I don't know where to go from here. Can you help? I am new to all this so I don't know what I am doing wrong
I've heard people had success installing it when using the Intel version of Blender for Mac. The ARM Blender build isn't supported by mediapipe.
Thanks a lot for this awesome work!!! Can I insert keyframes for the pose bones as well?
thank you so much for this.
amazing app
Great job, but may I ask... last time I tried blendartrack, I finished installing the addon, but there's only a step for installing the addon... any tips?
I don't really understand; blendartrack doesn't require external dependencies.
When setting up BlendARmocap, it's required to run Blender as admin and click the "install dependencies" button in the addon preferences.
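If that button keeps failing, a common community workaround is to install the packages into Blender's bundled Python by hand. This is only a sketch, not the addon's documented procedure, and the paths are examples for Blender 3.1 that need adjusting to your install:

```shell
# Windows (run the terminal as administrator):
"C:\Program Files\Blender Foundation\Blender 3.1\3.1\python\bin\python.exe" -m pip install mediapipe opencv-python

# macOS (Intel build of Blender):
/Applications/Blender.app/Contents/Resources/3.1/python/bin/python3.10 -m pip install mediapipe opencv-python
```

The key point is to invoke the Python executable that ships inside the Blender installation, not the system Python, so the packages land where Blender's interpreter can import them.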
@@cgtinker Thanks for your answer, and sorry for my late reply. I can now use blendartrack without the issue. Your software is really helpful for the community. After my own project is finished and paid for, I hope I can become your patron :D Sorry for my bad English, greetings from Indonesia.
After successfully tracking the movement, how do I retarget it to any 3D model?
amazing man!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Hi CG! First of all, fantastic addon!
I'm a complete novice at this and I'm trying to map face mocap from BlendARmocap onto an already existing face model that I rigged following your "blendartrack - ar face tracking to rigify rig transfer" video. I can capture the motion data and transfer it to a metarig as you demonstrate in this tutorial, but I'm running into errors when I try to transfer the same data to my pre-existing rigged model.
Is it possible to do this, and if so, how?
Many thanks :)
Amazing
Thank you so much. Is it possible to make the movement of the face together with the movement of the hands, at the same time?
Yes, absolutely
ua-cam.com/video/qtHf84YJvhk/v-deo.html
Hello there, love the addon. I am studying this topic as well. I'm not as good, but may I ask if there will be a way to use pre-recorded videos as the input feed for mediapipe?
Not yet, but in the future yep.
@@cgtinker thank you for the reply.
Great work.
I have a question:
I followed the steps you mentioned in the video. The mediapipe tracking looks fine. However, when transferring the animation, the hands are backwards, and when I bend my fingers they bend towards the back of the hand. Do you know what might be the reason for this problem?
Thanks
Most likely due to bone rolls. I've covered that in the 'rigging rigify metarig' tutorial.
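A bpy sketch of recalculating the rolls, assuming the metarig is the active object; which roll axis is correct depends on the rig, so `GLOBAL_POS_Z` here is just one common choice, not the tutorial's exact recipe:

```python
import bpy

# Enter edit mode on the active armature and select all bones.
bpy.ops.object.mode_set(mode='EDIT')
for bone in bpy.context.object.data.edit_bones:
    bone.select = True

# Recalculate rolls against a consistent axis; inconsistent rolls are a
# typical cause of fingers bending toward the back of the hand.
bpy.ops.armature.calculate_roll(type='GLOBAL_POS_Z')

bpy.ops.object.mode_set(mode='OBJECT')
```

After recalculating, regenerate the Rigify rig so the control bones pick up the corrected orientations.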
I can't copy and paste flipped poses. For example, if I make a fist with one hand and want to copy it to the other, because only one hand is in the shot, how would I go about doing that? Also, how do I delete motion so there is only one posed frame?
Does this work on pre-rigged characters from Blender Studio?
Super awesome. Would love it if it were possible to use video files as sources for the tracking. But otherwise very cool.
Hi! Is it possible to get more FPS than that? I mean, it's so jittery, and I don't know if it's from my webcam or something else. Do I need a better webcam? Thanks!
It asked me for the "mediapipe" module folder, and now it asks for the "cv2" folder. Where can I find it? 🤔😲
Hi CG Tinker, I would love to try out your addon, but I cannot install the dependencies even though I opened Blender as admin (on macOS). I would be grateful for a hint!
I can't really tell based on that information. However, I'm working on a standalone executable to get rid of the dependencies in Blender... it will take a while though.
@@cgtinker Thank you! Looking forward to it!
Keep going bro😘😘
Hi! I followed the tutorial, selected the zip and checked the box, but the "install dependencies" button does not appear. Is there any solution? I started Blender as administrator.
So... I just downloaded a 1 MB folder. That's it? Is that the one that should be used as in the video?
How can face data be transferred to Human Generator characters?
Yeah, check out the new release: ua-cam.com/video/qtHf84YJvhk/v-deo.html
Hi sir, how can I set it up on a Windows laptop? I already installed the addon, but there's no stream detection from the camera.
Hello, I just bought the addon and downloaded version 152, but on Blender 3.3 the addon doesn't even install; there is an error message every time I click "install dependencies". I spent more than 5 hours but can't get it installed on Blender 3.3 at all. I need your help. Thanks.
Really thanks a lot 😊.
Cool addon. How can you use Plask together with your face tracking?
Is it possible to transfer animation while tracking?