PLEASE READ:
- Metahuman Animator REQUIRES iPHONE 12, NOT 11
- If you want my Live Link Face recording and Move One raw animation to follow along with the tutorial, you can get the files on our Discord: discord.gg/thedarkestage
Fantastic!! I think a tutorial like this is what the entire community has been waiting on! Thank you for this.
Awesome, thank you!
Sir, please make a virtual production tutorial
Incredible and comprehensive tutorial, thank you! As you point out, there are other (shorter) tutorials out there that cover bits and pieces of this info, but nothing as completely packaged and comprehensive as this one you just uploaded! And an excellent summary of the inexpensive body MoCap solutions for us indies ha ha, thanks a ton for doing this!
Really glad to hear it, I was trying to make what I thought people wanted
I've been searching, looking, and struggling just to find this video today
Thank you so so much
Glad I could help!
The sphere method is what I do as well. The reason the ball goes off into a random space is because the transform it was using for world space becomes its local coordinate when you attach it. I haven't had much luck with keeping world space when attaching or detaching, but it's not much of an issue. Great overview of the process, thanks for the tutorial!
Ah yes, I have just learned this! Makes sense.
This is a fantastic video. I'm going to be trying this for my UE5 short film. Thank you!
That's awesome! Would love to see the final product, please consider posting it in the discord!
Head reattachment made easy: just select the character, then check and uncheck Actor Hidden In Game. Great tutorial!!
Brilliant! Thank you!
It's a really good guide to creating cinematics in UE for beginners, as far as I know :) Thank you
Glad it was helpful!
Great tutorial!
As far as I know, when you attach an object, its location becomes relative to the attachment point. So you basically need to set the attached object's location to 0,0,0 if you want it to be at the same place as the attachment.
Oooh that makes so much sense, thank you!
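The exchange above can be illustrated with plain transform math. This is a minimal sketch of the idea (translation only, ignoring rotation and scale), not Unreal API code; the function names and positions are hypothetical:

```python
# Why an attached object "flies off": its stored world-space location is
# reinterpreted as a local offset from the new parent on attach.

def attach_keep_local(child_world, parent_world):
    # Naive attach: the old world position is reused as the local offset,
    # so the child ends up at parent + old world position, far from the head.
    local = child_world
    return tuple(p + c for p, c in zip(parent_world, local))

def attach_zeroed(parent_world):
    # Setting the attached object's transform to 0,0,0 snaps it exactly
    # onto the attachment point, as the comment above suggests.
    local = (0.0, 0.0, 0.0)
    return tuple(p + c for p, c in zip(parent_world, local))

def attach_keep_world(child_world, parent_world):
    # To truly keep the world position, the local offset must be the
    # difference between the child and parent world positions.
    local = tuple(c - p for c, p in zip(child_world, parent_world))
    return tuple(p + c for p, c in zip(parent_world, local))

sphere_world = (100.0, 50.0, 30.0)  # hypothetical sphere position
head_world = (10.0, 20.0, 170.0)    # hypothetical head socket position

print(attach_keep_local(sphere_world, head_world))  # (110.0, 70.0, 200.0)
print(attach_zeroed(head_world))                    # (10.0, 20.0, 170.0)
print(attach_keep_world(sphere_world, head_world))  # (100.0, 50.0, 30.0)
```

In Unreal terms, the last case corresponds to attaching with a keep-world rule rather than a keep-relative one.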
Amazing tutorial. This is how a proper tut should be. ;) Thanks a lot... I know the work this takes. Super useful, will be looking out for more of your videos.
Amazing tutorial! Taking the cheap but effective approach to performance capture. Really Great stuff. Subscribed !
Thanks for the sub!
Outstanding tutorial! I believe the heavy weight on the head and the 60-second limit is a problem for acting. I would rather record face and body separately: face first, then use the audio to guide the body performance.
Yeah it is definitely a limitation, and what you mentioned is a popular option. Just keep in mind there are a lot of subtle movements with the eyes that are hard to sync perfectly when doing the animations separately. It can lead to the uncanny valley.
Superb work. Thanks a lot for all the details.
My pleasure
many thanks friend, amazing tutorial
You are very welcome, glad you like it!
Amazing Tutorial!
Glad it was helpful!
Amazing tutorial, and it makes life easy for beginners. Thank you👏
You're very welcome!
Thank you for all the knowledge! You're the best!
My pleasure!
Thank you so much...🎉❤... I've been looking for this for so long.
Enjoy!
Great Tutorial
Thank you!
wowww Amazing tutorial!!!
Thank you!
exactly what i was looking for
Awesome, glad to hear. Let me know how the format works for you
Great job!
Thanks!
You know what makes all face capture look over the top? It's the over-animated expressions; it doesn't look like modern-day acting. On the other hand, if you look at the Meet the Heavy video, Valve hand-animated the mouth movements so naturally, with minimal motion.
Man those TF2 videos were awesome and I def want to do something in that style
awesome bro
Thanks ✌️
To track, just choose tracking, grab the Doppler and pick him, and he will be tracked
Which part are you referring to?
just what I needed
Awesome, let me know what you think, and if you end up following the whole process!
Bro, is the character meant to look at the camera? If so, you can constrain CTRL_EYES to look at the camera to make it more realistic.
That is very good advice and something I will likely work into a future tutorial
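The look-at idea from this exchange boils down to simple vector math: aim the eyes control at the camera by computing the yaw and pitch of the eye-to-camera direction. This is a generic sketch, not Unreal's actual constraint system; the positions and axis conventions below are assumptions for illustration:

```python
import math

def look_at_angles(eye_pos, target_pos):
    # Direction from the eyes control to the target (e.g. the camera).
    dx = target_pos[0] - eye_pos[0]
    dy = target_pos[1] - eye_pos[1]
    dz = target_pos[2] - eye_pos[2]
    # Yaw: rotation around the up axis; pitch: up/down tilt.
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch

eyes = (0.0, 0.0, 160.0)      # hypothetical CTRL_EYES world position (cm)
camera = (200.0, 0.0, 160.0)  # hypothetical camera position

yaw, pitch = look_at_angles(eyes, camera)
print(yaw, pitch)  # 0.0 0.0 — the camera is straight ahead, no rotation needed
```

In practice, a look-at constraint in the engine or a DCC tool evaluates something like this every frame and drives the eye control's rotation with the result.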
Thank you so much! Crazy work!!! Respect and thank you for showing all this! Do you create always a new Metahuman Identity or do you reuse old if it is the same Headrig for example?
Thank you!
That is a great question - so I actually create a new Metahuman Identity for every facial animation I use, even if it is the same headrig. Basically, every time I roll the Live Link Face app, I do the face calibration gazes at the beginning and set up a new identity and performance in UE. I think you might not need to do that, but every time I have tried to reuse the same identity across multiple performances, I get weird animation bugs.
@@NorthwoodsInteractive thank you!!!
I watched the whole thing at 1x speed. I'll wait for the advanced tutorial.
Awesome, hopefully it was helpful!
@@NorthwoodsInteractive yes it is. By the way, when I said I would wait for the advanced one, I meant that if this amazing video is the beginner tutorial, I can only imagine how awesome an advanced tutorial would be.
Hello there, this is an extremely useful tutorial! Thanks for showing us everything, from the beginning till the end!
I have a question though: How does Move One handle body occlusion? For example, if I turn around and my arms are occluded by my body, I assume Move One loses track of them, right?
Every clip I have watched displays only full frontal movements, which usually isn't the case if someone wants to create lifelike animations.
Thanks for your reply!
As far as I know, it does not do anything special for occlusion; if a limb is out of sight, it won't know what to do with it and will likely bug out. It seems to work OK in this animation when I turn away a little, as though it has a bit of predictive ability. I have not stress-tested it, and I have always tried to give the camera as good a view as possible.
@@NorthwoodsInteractive Thanks for replying. Unfortunately, that's what stops me from getting into Move One or other similar mocap systems that process video files: when creating lifelike motions, on many occasions the arms will be occluded by the body and the software will lose track of them. Only two or more cameras can solve this.
Please make a video on what is the best way to transfer data from iphone to pc
Not sure if you're joking, but that is in here. I felt a little silly adding it, but I actually had to figure it out since I don't normally transfer from an iPhone to a PC.
@@NorthwoodsInteractive I'm using iCloud: sync, and then download when it's done.
Use OCIO for color
Where would I specify that? In project settings?
hello! this is excellent thank you! how long did the whole process take?
Thank you, glad it was what you were looking for! That's a great question, one I should have answered in the video... I did the whole process twice, once to see if I could actually get the results I wanted using Move One, and another doing the same thing for the tutorial. The first time, it took me about six hours to complete the entire scene. The second... much harder to say since I was making the tutorial at the same time.
This is a great tutorial. Regarding the depth sensor, previously I thought the iPhone 10 was the minimum requirement; now it's iPhone 11?
Thank you very much.
I just checked, and it looks like it actually requires iPhone 12!! I could have sworn it was 11. I will make an annotation to the video. Thanks for pointing that out!
@@NorthwoodsInteractive thank you, once again the detailed tutorial is extremely helpful! Glad to know the requirements :)
The best way to transfer from iPhone to PC is to send it over WhatsApp to yourself, I mean to your own number. My WhatsApp account shows up in my contacts, and I can message myself. Or upload your Live Link Face take from your phone to Google Drive, which zips it automatically; then on your PC, download that zip file from your Drive. That simple.
Oh yeah, that will definitely work. The Metahuman Animator files can get a little big, so it depends on your internet. Most people should be able to transfer using Drive.