Boo-yah! I've been following the facial mocap scene for the past two years and it's so exciting there's finally a truly free workflow from beginning to end.
This is going to open the door for so many people 🥳
DUDE!!!! I love that strikethrough on the word IPHONE. Makes me so happy.
poor fak
This is so excellent! I have been searching for months for a way to get mocap from pre-recorded video. The Audio2Face workflow was too complicated for me, and I definitely prefer something integrated into Blender. This was perfect!! I already have FaceIt and had the model pretty much set up. I followed this tutorial tonight and within no time I had a result. I was surprised how well it captured the head tilts. In the video, the person was wearing glasses with a lot of glare on the lenses, and surprisingly the software caught most of that movement. I have a lot of clean-up to do, but this works great as a starting point. Thank you so much for making this tutorial!
😭😭😭 You have no idea how much I've needed this, especially when the Reallusion guys have just dropped their AI facial mocap stuff.
The Reallusion thing looks much more advanced than this. But hey, this one is free!
This video should be part of YT recommendations for any artist or developer. Great work you're doing with animation and mocap. And a big thank you to all devs who made these two addons available for free.
Nice, glad you found it useful! ❤
@@CGDive Indeed. Would you show the proper way to bake and export the facial blend shape animation to FBX? I want to import the CSV in Blender and send it to other DCCs. Thanks. Keep it up.
@@LFA_GM Shape keys should export out through FBX without any special preparation!
Thanks @@CGDive. I'll give it a try.
Thanks for your video! I've been following the channel for a long time and this is one of my favorites. IMHO This is the best channel about creating animations. You do a great work!
Wow, thanks! Appreciate the nice words :)
@@CGDive Sir, please share the link to the application.
@@Livingbyygrace Do you mean the application from the video? All links are in the video description.
I've been waiting for this video for a long time! You are the best!
Hope you enjoy it!
13:41 - OMG, "COMBINE" finally works in local SRT with offset! Gosh! It took forever to see it fixed. Thank you for another amazing video!
Yeah, not sure when it was fixed. It must have been a couple of versions back.
Thank you so much for this video. You have solved my biggest problem and cleared up my doubts. For the past two years I was wondering how to do facial expressions in animation. Thank you once again, God bless you.
This is perfect! I actually don't need the head rotations
dude I love you! Thank you for sharing your knowledge. Your videos are perfect! You are incredible! Thanks!
haha, I am getting a lot more "I love yous" on this video than I am used to. Seems like it is addressing a real problem. Nice!
The dead eyes tho. But still pretty impressive, probably useful with a bit of tweaking.
big time thanks for this! just what i needed!
Nice!
This is really good! Both as info and as tech... but it still seems like it could use some manual tweaking afterwards, because it didn't capture all your acting as well as it could have.
Yep, that's why I included the parts on tweaking the mocap 🙂
Yeah, it's not great. Google's MediaPipe released a lightweight face and landmark recognition model; that is what I used in the app. They have better, heavier models that aren't accessible to the public. Some commercial products possibly train their own models or purchase usage of the better Google models. But hey, it's free! (the app creator here)
I love you so much man! What would we do without you?
I'll try it next as soon as I finish my job. Thank you so much!
Very first professional experience
Hi there, my first try was successful, but when I tried again (FaceIt and Live Link Face) I got this error multiple times:
Python: Traceback (most recent call last):
File "C:\Users\myname\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\faceit\mocap\mocap_base.py", line 842, in execute
for x in futils.get_action_frame_range(
File "C:\Users\myname\AppData\Roaming\Blender Foundation\Blender\3.6\scripts\addons\faceit\core\faceit_utils.py", line 12, in get_action_frame_range
frame_range = action.curve_frame_range
AttributeError: 'NoneType' object has no attribute 'curve_frame_range'
Same problem. Can someone advise on why this happens? After clicking yes twice, it opens and closes and I don't know why.
Hi there, did you find a solution?
@@yosrh8294 no
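The traceback above says FaceIt reached `get_action_frame_range` while no action was assigned (`action` is `None`), which usually means no mocap keyframes were ever created on the target. A minimal pure-Python sketch of the kind of guard that would avoid the crash (the function name mirrors the traceback, but `FakeAction` and the default range are hypothetical, not FaceIt's actual code):

```python
def get_action_frame_range(action):
    """Return an action's (start, end) frame range, or a safe default
    when no action is assigned -- the case that raised AttributeError."""
    if action is None:
        # No animation data on the object yet: return an empty range
        # instead of crashing on action.curve_frame_range.
        return (0.0, 0.0)
    return tuple(action.curve_frame_range)


class FakeAction:
    """Stand-in for bpy.types.Action, which exposes curve_frame_range."""
    curve_frame_range = (1.0, 250.0)


print(get_action_frame_range(FakeAction()))  # (1.0, 250.0)
print(get_action_frame_range(None))          # (0.0, 0.0)
```

If this reading is right, the error should go away once the CSV import actually creates an action on the object, so re-importing before baking is worth a try.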
Thank you so much, you are still the best and you save the world. I am now looking for hand capture with finger movement but could not find it. I hope you can help us (all the world) with that in a nice video, like the one above.
thanks so much
I love doing it by hand 😊
Hi, thank you. I am facing a problem. When I try FaceIt, the head reacts the wrong way: when I turn my head left/right in the video, the character's head turns up/down in Blender, and when I tilt my head up/down in the video, it turns left/right in Blender. I am using Auto-Rig Pro with FaceIt.
Did you figure out a solution by any chance? :')
@@chiddin2895 still not
This is great! I think a great addition to the suite of tools would be a way to scale up the responsiveness of the captured rig points, i.e. selecting the brows, lips, jaw and setting a multiplier for scale for more intense/detailed performance capture. Is the capture framerate based on the set framerate within blender or does it capture at the camera framerate and interpolate frames? Either way, I'll definitely be grabbing the add-ons to play with! I'm working on a character in unreal engine and so far I've had to just keep the face static.
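The multiplier idea in the comment above is easy to prototype outside the add-on: the tracker outputs ARKit blendshape weights in the 0–1 range, so an intensity boost per region is just a scale plus a clamp. A hypothetical sketch (the per-frame dict layout and shape names are assumptions, not the add-on's API):

```python
def scale_regions(frame, multipliers):
    """Scale selected blendshape weights, clamped to the ARKit 0..1 range."""
    out = {}
    for name, value in frame.items():
        factor = multipliers.get(name, 1.0)  # untouched shapes keep factor 1.0
        out[name] = min(max(value * factor, 0.0), 1.0)
    return out


frame = {"browInnerUp": 0.2, "jawOpen": 0.5, "mouthSmileLeft": 0.9}
boosted = scale_regions(frame, {"browInnerUp": 2.0, "mouthSmileLeft": 1.5})
print(boosted)  # browInnerUp doubled, smile clamped to 1.0, jaw unchanged
```

The clamp matters: without it, boosted weights above 1.0 can push shape keys past their sculpted extremes and break the face.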
very useful
Hi! When I transfer a model with a FaceIt rig, the FaceIt addon stops seeing the rig, and the ability to animate with Live Link Face is lost. How can I reconnect the addon to the rig once it has lost it?
Thank you very much !
a dream come true
Thank you.
Would be nice to know how well it tracks the eyes. Because the eyes are the biggest issue here. Not just the eyelids.
It's hard to quantify. I think it tracks the eyes about as well as the other features: not perfect, but somewhat accurate.
Is there a way to do live capture in the app without an iPhone though? Because I have only seen the workflow where a video becomes a CSV that you then import, but for live capture there are only four apps and all of them are iPhone exclusives.
Thank you man Thank you so much thank you ×100×♾️🥹
No problem 👍
So cool! Is there something for UE5 and any phone? I just use MetaHuman.
Not sure about that, sorry. I am planning to focus on UE soon!
Omg finally!
Hey guys, just curious: how does this quality compare to iPhone mocap? If I use a webcam or Android phone (without a depth sensor), is it still comparable to iPhone mocap, or slightly worse?
For my part, Face Landmark Link only worked once, and since then there is a bug when opening it: all I get is a command prompt.
Report it to the developer!
Hi, this is amazing, but I am wondering if it also works without FaceIt? It's quite expensive for starters. I'm using Daz3D models that come with a face rig, or I can manually add a face rig with the metarig from Blender. Will that also work?
It does work without FaceIt. Did you watch the video? :D
I am not familiar with DAZ or its face rig so I can't help there, sorry.
Thanks for your quick response! I'm sorry if I missed the answer within the tutorial. I did watch it, and the full tutorial is applied to a FaceIt rig, but yes, I skipped some parts, so again, sorry if the answer is in there @@CGDive
LiveLinkFace won't work with the CSV file. My character's face stays idle.
Hi, thanks for this vid. I wanna ask: does this work with Linux?
The app that processes the video is built for Windows. I think the code is available, and it can be built for Linux by someone who knows how to do it.
Once you do the mocap in blender, can you bring that information into Unreal Engine 5?
It should be possible
Nice🎉👍🏻
🎉
Thank you❤
Welcome!
For knowledge it is OK, but Rhubarb is best for short videos.
Hello. When I launch the Face_Landmark_Link program, I select a video to track and the program simply closes and creates an empty csv file. What could be wrong? Has anyone encountered such a problem?
I am not sure, sorry. With technical issues, your best bet is to ask the developer.
Can the same thing be done in Unreal with MetaHuman, without an iPhone?
Not sure.
There's an error showing up while importing the CSV file. It says: ValueError: could not convert string to float: 'V'. I put a .mov file in the load audio section and a CSV file in the CSV import section, and kept the frame rate matching the .mov file. Then it showed the error when I clicked import CSV.
I cannot help with app errors. Please get in touch with the tool developer.
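For what it's worth, `could not convert string to float: 'V'` usually means a non-numeric row (a header line, or a file that isn't the tracker's CSV at all) is being fed to the number parser. A small sketch for sanity-checking the file before importing, assuming the usual layout of one header row followed by numeric columns:

```python
import csv

def check_mocap_csv(path):
    """Report the first cell that fails float conversion, or an OK summary."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    for i, row in enumerate(data, start=2):  # 1-based file lines, after header
        for cell in row:
            try:
                float(cell)
            except ValueError:
                return f"line {i}: non-numeric value {cell!r}"
    return f"OK: {len(header)} columns, {len(data)} data rows"
```

If the checker flags line 2, the importer is probably choking on an unexpected second header or a wrong file; if it flags nothing, the problem is more likely an addon-version mismatch.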
Master Class
Thanks!
Plz make same for unreal engine
Thanks so much for the tutorial. I used the free addon. The CSV file loads with the keyframes inserted, but it has no effect on the character. What could be the possible solution?
I can't say for sure. Does the character have ARKit shape keys?
I used a character from Character Creator 3. Are its shape keys different from the ARKit shape keys? How can I make the ARKit shape keys? Thanks bro.
@@animbenard9747 I don't use CC so I can't say for sure, sorry. I believe CC characters do have ARKit shapes, or at least something similar, so you have to make sure that you are bringing them into Blender, and also that the shape names are correct. Good luck!
@@CGDive Thank you so much bro.
Renaming the shape keys to the ARKit names worked fine. Thanks boss. You did great!
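For anyone else landing here from Character Creator: the fix is renaming the existing shape keys to the ARKit names the add-on expects. A minimal sketch of that renaming logic (the CC-to-ARKit pairs shown are illustrative guesses, not a complete or verified mapping; in Blender, `key_blocks` would be `obj.data.shape_keys.key_blocks`):

```python
from types import SimpleNamespace

# Example CC -> ARKit name pairs (illustrative only -- verify against your model).
CC_TO_ARKIT = {
    "Brow_Raise_Inner_L": "browInnerUp",
    "Jaw_Open": "jawOpen",
    "Mouth_Smile_L": "mouthSmileLeft",
}

def rename_shape_keys(key_blocks, mapping):
    """Rename every shape key found in the mapping; return how many changed."""
    renamed = 0
    for kb in key_blocks:
        if kb.name in mapping:
            kb.name = mapping[kb.name]
            renamed += 1
    return renamed

# Stand-ins for real shape key blocks:
keys = [SimpleNamespace(name="Jaw_Open"), SimpleNamespace(name="Cheek_Puff")]
print(rename_shape_keys(keys, CC_TO_ARKIT))  # 1 -- only Jaw_Open matched
```

A dictionary-driven rename like this is easy to extend shape by shape as you discover which CC names correspond to which of the 52 ARKit targets.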
It didn't ask me to import a video file. It just launches my webcam and starts to track live footage. I don't know where to find the csv file when I stop the tracking.
I can't say for sure. Please ask the app developer!
I realised I downloaded the wrong version. Thank you for your swift reply!
Thanks! I have looked at it, and I do not see why it can't run on Linux or most other systems, as it is made in Python with NumPy and Tk, but I'm not much of a programmer either and can't get the libraries to work together.
Anyone here up for the task?
According to the developer, it can be built for Linux or Mac. It's just that they only provide a pre-built version for Windows.
I keep getting Github pages that don't match your description. Can you post the exact link? Thank you.
I can't say why. The links in the description box seem to be correct.
The mocap extraction app
github.com/Qaanaaq/Face_Landmark_Link/
The blender addon:
nickfisher.gumroad.com/l/tvzndw?layout=profile
Can you make mocap with wrinkle animation?
If you set up the wrinkles, sure!
@@CGDive What I meant was to suggest that you make a tutorial on what I mentioned above.
how to do it in cinema 4d?
ask cinema 4d :)
Good day guys, I am a new subscriber. I downloaded Face Landmark, but when I try to install it, it shows two options, one saying (mild filtering...) and the second saying (aggressive filtering), and neither of these options seems to work. Please assist.
Try asking the tool developer, that's your best bet.
Hi, you downloaded the live streaming version. Try the normal version.
Does this method work with Auto-Rig Pro?
For the Auto-Rig Pro face rig? No, it only works with a specific set of shape keys, the so-called "ARKit" shapes. Some info here:
ua-cam.com/video/KQ32KRYq6RA/v-deo.html
@@CGDive Is there another way to easily make the 52 ARKit blendshape keys in Blender?
Does it work with Unreal Engine?
Shape keys work with Unreal.
Hey, Face Landmark is not installing. Why????
I don't mean to be rude but do you really expect that someone can answer this?
Hey, my car doesn't start. Why?
Hey, my fridge doesn't cool. Why?
I hope you see what I mean. :)
Does this work for Rigify 3.22?
Not sure. Test it. :)
I'm pretty sure Rigify doesn't have ARKit shapes, so...
@@jaim2 Oh, it says Rigify. I assumed they meant Blender 3.2
@@CGDive Yeah, I know, but still, the Rigify face rig doesn't have ARKit shape keys.
Oh wait, never mind, I realized that it is possible with a paid addon, FaceIt.
Face Landmarker is only for Windows!?
I think so unless you build it for your OS.
I'm having trouble running Face Landmarker on my Mac (running Windows virtually)... how can I build it for macOS? Thanks in advance! @@CGDive
@@mentawaisurftrips I hope you are not expecting help from me personally because I have no idea, sorry.
LiveLinkFace not working. Console flashes for a split second and disappears.
Contact the developer!
Also this tutorial is not about LiveLinkFace lol
@@CGDive I was just being stupid. I figured it out. Thanks.
Dear Sir/Madam,
I am facing a 'The specified file is not valid' error while importing mocap. Please guide me, as the addon is worthless to me without this feature and I just purchased it after watching this video. Thanks.
Hello. You didn't specify which addon you mean. But even if you did, these are not my add-ons; I just make tutorials about them. Please contact the developer from whom you purchased the product that you are having trouble with. Thank you!
I was talking about FaceIt, but my issue is now resolved after switching to the previous version of the addon @@CGDive
@@Bhuttaanimation great to hear!
"The specified file is not valid." why i get this error when import csv into faceit
I can't say. It could be too many things from literally using an invalid file, to blender versions, addon versions, using the wrong feature, etc. etc. Try again. If the problem persists, you can try to share a bug report with the related tool developers.
@@CGDive Thank you for the reply. I emailed the FaceIt developer together with the CSV file. For now I use Hallway Cube's OSC stream function; I just worry the load on my PC will be too great, since I use Brekel Body with 3 Kinects for body motion capture and Hallway Cube for facial motion capture together. By the way, thank you so much for providing such a detailed, great-quality tutorial.
I have the same error. Latest version of faceit.
@@unknownproductions9879 Later I will try the older stable version. Currently I am using the beta version; not sure if this is the cause.
Dear, I am also facing the same issue. Please guide me if you found the solution @@raymondwood5496
And so not usable when it's Windows-only.
Does nothing exist yet to make life easier for the blendshapes?
I really want to work on that, and I think I can solve it, but I would need someone to fund me to do it :\
There's a video about using Nvidia Omniverse Audio2Face to generate the ARKit blendshapes. But I'll say it's not easy. I got the blendshapes out to Blender; now I'm trying LiveLinkFace, and on the first try the animations do not play.
I need capturint
What is "capturint"?
Reallusion crying in the corner.
exe file not in this
Not sure what that means. All links work!
THIS IS FREE?
Yep 👍 (except for FaceIt)
Apache license. 🙂
Wasn't really impressed with the result, unfortunately. It looks like 90% of your expression was lost in the conversion, especially around the eyes.
If you expect to get Gollum out of the box without any tweaks, being disappointed is to be expected.
People want excellent results without the proper effort. That's today's world, sad to see!
@@CGDive Please, how can I get the application?
@@Livingbyygrace Do you mean the application from the video? All links are in the video description.