PLEASE READ:
- Metahuman Animator REQUIRES an iPhone 12, NOT an iPhone 11
- If you want my Live Link Face recording and Move One raw animation to follow along with the tutorial, you can get the files on our Discord: discord.gg/thedarkestage
Something I thought of later: You could actually have two sequences of the same animation, one for wider shots, and one for closer shots. The closer shots would have heavier smoothing to eliminate any jitter showing up in closeups, while the wider shots would have minimal smoothing, so the body movement and placement in space would be more accurate.
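The wide-shot vs. close-up trade-off in the note above can be sketched with a toy filter. This is plain Python, not Unreal's actual jitter reduction; the sample values are made up and `smooth`/`jitter` are hypothetical helpers, but they show why a heavier filter suits close-ups while a lighter one preserves placement for wide shots.

```python
def smooth(samples, window):
    """Centered moving average; a larger window means heavier smoothing."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def jitter(xs):
    """Mean absolute second difference: a rough frame-to-frame jerkiness score."""
    return sum(abs(xs[i + 1] - 2 * xs[i] + xs[i - 1])
               for i in range(1, len(xs) - 1)) / (len(xs) - 2)

# Fake head-bone X positions: a steady drift plus alternating per-frame jitter.
raw = [0.5 * i + (0.3 if i % 2 else -0.3) for i in range(20)]

wide_shot = smooth(raw, window=3)  # light smoothing: placement stays accurate
close_up  = smooth(raw, window=9)  # heavy smoothing: jitter nearly gone

print(jitter(raw) > jitter(wide_shot) > jitter(close_up))  # True
```

The same recorded take can feed both sequences; only the filter strength differs per shot.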
I cannot thank you enough for this. The quality of this video, the amount of editing and planning, and just the overall quality is insane. This single 2 hours replaces around 20 other videos I have been trying to Frankenstein together to get this workflow to work, but each one was on a different version, and some methods were outdated but were the only tutorial I could find. This is a godsend and has most likely saved me months on the personal project I am working on. As a motion graphics/VFX artist coming from Houdini/C4D and trying to learn UE through tutorials, it has been a huge pain, but this is exactly the info I have been trying to find. 2k subs is criminally low for content this high quality; thank the YouTube algorithm gods for randomly suggesting this! You've got a sub and someone who will direct everyone in my situation to this video. Cheers!
I'm really glad to hear this was so helpful! Thanks for watching :) I would love to see what you end up making, consider sharing on our discord!
@@NorthwoodsInteractive will do!
That's so useful for the community of filmmakers. You bring it all together. Many thx 🙂
This has helped me so much I can't express how much. You are professional, precise and thorough. I can't believe you only have 3.54k subs.
This tutorial is insane timing… this has saved me so much stress with my shortfilm
Man I wish I found your channel sooner. You casually answered all my questions and doubts in one video and that’s crazy considering how many tutorials I’ve watched over the years. I’ll be your loyal subscriber from now on🔥😂
Incredible and comprehensive tutorial, thank you! As you point out, there are other (shorter) tutorials out there that cover bits and pieces of this info, but nothing as completely packaged and comprehensive as this one you just uploaded! And an excellent summary of the inexpensive body MoCap solutions for us indies ha ha, thanks a ton for doing this!
Really glad to hear it, I was trying to make what I thought people wanted
I've been searching, looking, and struggling, just to see this video today
Thank you so so much
Glad I could help!
I can't even begin to explain how much I have enjoyed every bit of this tutorial. It is actually one of the best Unreal Metahuman tutorials I have ever had the pleasure of watching; everything is detailed so perfectly that you have my utmost respect. I was afraid of Unreal Engine, but this tutorial alone made it possible for me to download it and start my ventures. Thank you so much, sir, and I promise that once I am done with my project I will tag you.
This... This is what we were waiting for. Massive thanks guiding me through this maze.
Dude this channel deserves million subscribers
I just wanted to thank you for doing this tutorial. Truly a model of how tutorials should be done. I've used the concepts from this tutorial on other projects and recently decided to follow your project step by step for practice. I've finished, but had to go back and work on my lighting. I think lighting can make or break a scene/project and is one of the hardest things to get right. I hope to see more tutorials like this on your channel in the future. Thank you again and God bless.
Thanks for guiding us filmmakers just starting out! Metahuman Animator and Move AI are indeed the beginner/indie solution
Thats EXACTLY what I was looking for. Great video!
This is amazing, thank you! First one I have found that covers everything from start to finish for this topic and in perfect detail.
Head reattachment made easy… Just select the character and check and uncheck Actor Hidden In Game. Great tutorial!!
Brilliant! Thank you!
@@NorthwoodsInteractive Sadly I think this only works when the head detaches during a body animation only. I followed your steps of combining facial mocap and body mocap and was also forced to restart the engine in order to fix it. Please let me know if you find a fix and I'll do the same.
Wow I definitely need more than one thumb up! What I great session, really well explained and always on point! Thanks for sharing your workflow!
This is awesome, thank you so much! Can’t wait to try this out.
I am making a 3D animation movie using Metahuman. This information is helpful. Thank you so much.
Awesome video mate! Ur a hella talented guy it’s awesome to see the sick content ur churning out!
Fantastic!! I think a tutorial like this is what the entire community has been waiting on! Thank you for this.
Awesome, thank you!
Sir make some virtual production tutorial
Amazing tutorial. This is how a proper tut should be. ;) Thanks a lot...i know the work this takes. Super useful, will be looking out for more of your videos.
Really useful! Regarding the focus target (cube) that flies off when you attach it to the head bone: you should reset its transform to (0,0,0) and then attach it.
Yep, that's the ticket!
This is a fantastic video. I'm going to be trying this for my UE5 short film. Thank you!
That's awesome! Would love to see the final product, please consider posting it in the discord!
It's a really good guide to creating cinematics in UE for beginners, as far as I know :) Thank you
Glad it was helpful!
The sphere method is what I do as well. The reason the ball goes off into random space is that the transform it was using for world space becomes its local coordinates when you attach it. I haven't had much luck with keeping world space when attaching or detaching, but it's not much of an issue. Great overview of the process, thanks for the tutorial!
Ah yes, I have just learned this! Makes sense.
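A tiny translation-only sketch of what the replies above describe (hypothetical coordinates, rotation ignored): on attach, the object's stored transform values get reinterpreted as a local offset from the parent, so it jumps unless you either zero the transform (snapping it onto the bone) or recompute the local offset from the world positions. Unreal itself exposes this choice as attachment rules (Keep Relative vs. Keep World).

```python
def attach_keep_values(child_world, parent_world):
    """Naive attach: the world coords are reused as a local offset, so the object jumps."""
    local = child_world                                        # same numbers...
    return [p + l for p, l in zip(parent_world, local)]        # ...new meaning

def attach_keep_world(child_world, parent_world):
    """Compute the local offset that preserves the world position on attach."""
    local = [c - p for c, p in zip(child_world, parent_world)]
    return [p + l for p, l in zip(parent_world, local)]

head_bone  = [100.0, 50.0, 170.0]  # hypothetical head-bone world position
focus_cube = [100.0, 80.0, 170.0]  # cube placed 30 units in front of the face

print(attach_keep_values(focus_cube, head_bone))  # [200.0, 130.0, 340.0] -- flies off
print(attach_keep_world(focus_cube, head_bone))   # [100.0, 80.0, 170.0] -- stays put
```

The (0,0,0) tip from the other comment is the special case where you want the object exactly on the bone: a zero local offset lands it at the parent's position.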
Thank you for this generous and passionate tutorial, best step-by-step tutorial ever!🥰
Amazing tutorial, and it makes life easy for beginners. Thank you👏
You're very welcome!
This is excellent! Thank you for a great video!👍
This was awesome. Thank for this tutorial
Thank you very very much for the clean tutorial. Every second is great...
Amazing job and congratulations on the result. I only missed the animation of the clothes, which is not difficult to do with Chaos Cloth.
Yeah, I need to do some cloth sim
This was a great tutorial, thanks!
this is gold! thank you for that. will spread the word.
Exactly what I was looking for
Awesome, glad to hear. Let me know how the format works for you
Great tutorial!
As far as I know, when you attach an object, its location becomes relative to the attachment parent. So you basically need to set the attached object to 0,0,0 if you want it to sit at the same place as the attachment.
Oooh that makes so much sense, thank you!
I just saw the mocap helmets, sorry for my earlier post lol
Amazing tutorial! Taking the cheap but effective approach to performance capture. Really Great stuff. Subscribed !
Thanks for the sub!
Amazing tutorial, immediate like and subscribe. Gonna get started on this right away.
Awesome to hear! Please consider sharing your work in our Discord!
Absolutely perfect and helpful. Can you show how to integrate this with gameplay? I would really like to see a seamless cutscene integration, exactly like in The Last of Us. Can you recreate a similar style?
many thanks friend, amazing tutorial
You are very welcome, glad you like it!
Thank you so much for your work; you share so helpfully, and I'm so hyped by what I can do now
thanks brooooo, I automatically joined your channel
48 seconds, subscribed.
Outstanding tutorial! I believe the heavy weight on the head and the 60-second limit are a problem for acting. I would rather record face and body separately: face first, then use the audio to guide the body performance.
Yeah, it is definitely a limitation, and what you mentioned is a popular option. Just keep in mind there are a lot of subtle movements with the eyes that are hard to sync perfectly when doing the animations separately. It can lead to the uncanny valley.
I really learnt a lot......🤜🤛
Superb work. Thanks a lot for all the details.
My pleasure
Thank you so much! Crazy work!!! Respect and thank you for showing all this! Do you create always a new Metahuman Identity or do you reuse old if it is the same Headrig for example?
Thank you!
That is a great question - so I actually create a new metahuman identity for every facial animation I use, even if it is the same headrig. Basically every time I roll the Live Link app, I do the face calibration gazes at the beginning, and set up a new identity and performance in UE. I think you might not need to do that, but every time I have tried to use the same identity for multiple performances, I get weird animation bugs.
@@NorthwoodsInteractive thank you!!!
Hopefully more phone manufacturers will start adding LIDAR to their phones as well. The iPhone exclusivity sucks 😬
Hey, how do you not have a million subscribers? Unbelievable! Do you think that it would be possible to combine this workflow with Stretchsense gloves to get a better result with the hands? Thank you in advance!
Hey thanks! I am definitely interested in exploring this, and will look into it.
Thank you for such a comprehensive tutorial, very well done! I did notice a duplicated link in the description for the budget helmet rig, and was wondering if there was another part for the list we needed to snag to replicate your build 1:1? I see your rig has straight arms vs. the curved arms in your posted list, as well as a mount on the helmet that I can't seem to find. Thank you thank you again for any help, and can't wait to see more!
Hey, thank you for pointing this out! I fixed the link in the description, now it goes to the same two piece arm I have on my helmet, which also comes with the helmet mount. It ends up being more expensive than I remember when I first built mine, which was two years ago. Price now is closer to $70. Please let me know if you build it!
@@NorthwoodsInteractive You're the man, thank you so much! Will do!
Please make more videos like this✌✌
masterclass... thanks a lot man!
Exactly what I was looking for! Thanks :)
I already have an iPhone. Can I ask which iPad model we need as a minimum requirement?
I am not sure, but their app page just says iOS 16.4 or later
Thx a lot 👍🏻
Thank you so much...🎉❤.. been looking for so long.
Enjoy!
Just one word: cool
To be super honest, I have spent so much money trying to achieve full body mocap, and right now it's… kind of a disaster. lol. I even bought multiple iPhones to get Move multicam working, and it never did. Looks like you are using their latest, different multicam and it works? I'd love a video on that. For me, it was a joke. Never worked. And the company didn't seem to GAF. Inertial mocap requires a huge amount of cleanup, regardless of which you choose. You need to be a pro animator. So yeah, I'm… a bit… burned. 😂
Thank you for all the knowledges! Youre the best!
My pleasure!
Great tutorial and breakdown of the entire cinematic. My only question: for people with an iPhone 11 Pro Max, what is the alternative to Metahuman Animator?
Unfortunately you need iPhone 12 or newer, I misspoke in the video
Thanks! Good vid. Let's hope they don't screw up all that wonderful Quixel content too badly with the switch to Fab.
hello! this is excellent thank you! how long did the whole process take?
Thank you, glad it was what you were looking for! That's a great question, one I should have answered in the video... I did the whole process twice, once to see if I could actually get the results I wanted using Move One, and another doing the same thing for the tutorial. The first time, it took me about six hours to complete the entire scene. The second... much harder to say since I was making the tutorial at the same time.
wowww Amazing tutorial!!!
Thank you!
Amazing Tutorial!
Glad it was helpful!
Great job!
Thanks!
just what I needed
Awesome, let me know what you think, and if you end up following the whole process!
very good job
thankful!
Great tutorial! Does the facial animator work with an iPhone XR at all?
No, it will only work with an iPhone 12 or newer
Hello there, this is an extremely useful tutorial! Thanks for showing us everything, from the beginning till the end!
I have a question though: How does Move One handle body occlusion? For example, if I turn around and my arms are occluded by my body, I assume move one loses track of them, right?
Every clip I have watched displays only full frontal movements, which usually isn't the case if someone wants to create lifelike animations.
Thanks for your reply!
As far as I know, it does not do anything special for occlusion; if something is out of sight, it won't know what to do with it and will likely bug out. It seems to work OK in this animation when I turn away a little, as though it has a little bit of predictive ability. I have not stress tested it, and have always tried to give the camera as good visibility as possible.
@@NorthwoodsInteractive Thanks for replying. Unfortunately, that's what stops me from getting into move one or other similar mocap systems that process video files: When creating lifelike motions, in many occasions the arms will be occluded by the body and the software will lose track of them. Only two or more cameras can solve this.
Totally agree. Move One is not an ideal solution, but the point of this video was to show it can still get some good results if some considerations are taken. I will probably try using it for some future work, just because it was so quick.
@@NorthwoodsInteractive Yes your point is valid and is proven through your video! Thanks again for showing us!
This is a great tutorial. Regarding the depth sensor, previously I thought the iPhone X was the minimum requirement; now it's the iPhone 11?
Thank you very much.
I just checked, and it looks like it actually requires iPhone 12!! I could have sworn it was 11. I will make an annotation to the video. Thanks for pointing that out!
@@NorthwoodsInteractive thank you, once again the detailed tutorial is extremely helpful! Glad to know the requirements :)
Great Tutorial
Thank you!
Great stuff! Cheers.
This is the first time I've felt like I understand the whole process. Thank you very much.
Not much detail on the MoveOne site, so some questions you may or may not be able answer:
What is the real minimum computer setup? I'm running an M1 iMac with 16GB of ram.
iMac has a good camera. Could that be used in place of the ipad?
Any idea when an android version of MoveOne might be released?
How much memory/storage space would a phone require for the captures?
Hey, I'm glad you found it useful! As far as I know, an iMac won't work; it needs to be an iPad or iPhone, since the app specifies an iOS version. The website has a waiting list for Android; not sure when it is coming out, unfortunately.
@@NorthwoodsInteractive Thanks again. It looks to me like using any form of mac to do the processing is a non-starter. There is no mesh to metahuman plugin. You might add that to your pinned note at the top. Bummer.
Amazing! Had to stop watching to ensure that I don't run off buying iphones, head rigs etc. 😅
I'm wondering if you've had a chance to test the new "Audio to Facial Animation" tool in UE 5.5. I wonder if we will get to the point where capturing anything will even be necessary.
I have not tried it yet. Totally possible we get to that point, but right now I am enjoying how relatively easy and cost effective it is to do performance capture
amazing
Bro, is the character meant to look at the camera? If so, you can constrain CTRL_EYES to look at the camera to make it more realistic.
That is very good advice and something I will likely work into a future tutorial
To track just choose tracking grab the Doppler and pick him and he will be tracked
Which part are you referring to?
awesome bro
Thanks ✌️
Hey! Thanks very much for this, really appreciate it! I have one important question: does it matter whether you have an iPhone 13 Pro or a 16 Pro for Live Link facial mocap or for Move One body mocap? I'm interested in whether there are significant or relevant quality improvements in capture. I'm really debating whether to invest in an iPhone 16 Pro for the better cameras, or would a 13 Pro do the job just fine? Thanks!
I do not know if there is a difference, that is a good question. I used an iPhone 13 and it seems to do pretty well. I am just guessing, but I imagine the quality improvements would be small, if any. It's still 30 fps, and I don't think the resolution of the camera makes too much of a difference. There might be a difference in depth sensor resolution, but I am not sure.
You know what makes all face capture look over the top? It's the over-animated expressions; it doesn't look like modern-day acting. On the other hand, if you look at the Meet the Heavy video, Valve hand-animated the mouth movements, so natural with minimal motion.
Man those TF2 videos were awesome and I def want to do something in that style
20:11 "And they're all completely free with Unreal Engine" - Only until the end of 2024 😬
Please make a video on what is the best way to transfer data from iphone to pc
Not sure if you're joking, but that is in here. I felt a little silly adding it, but I actually had to figure that out since I don't normally transfer from an iPhone to PC
@@NorthwoodsInteractive I'm using Icloud, sync and then download when it's done.
I watched the whole thing at 1x speed. I'll wait for the advanced tutorial.
Awesome, hopefully it was helpful!
@@NorthwoodsInteractive Yes it is. By the way, when I said I will wait for the advanced one, I meant that if this one, being so amazing, is a beginner tutorial, I can only imagine how awesome an advanced tutorial would be.
I’m trying to replicate your diy helmet and I can’t find the vertical elbow piece between the phone mount and the long helmet rod?
a.co/d/7Y4qkRv
Apparently I had the wrong link in the description
do you have any complete course? I would like to study with you, but if you dont have, please, can you say a course for us study?
I don't have any courses, and I haven't taken any full courses myself, but I know Bad Decisions Studio has a completely free beginner course on YouTube. For Metahumans, check out Feeding_Wolves and JSFilmz.
@NorthwoodsInteractive So your Sequencer timeline is 24 fps, the body mocap is at 60 fps, and the facial mocap is at 30 fps. Does it matter that the frame rates don't match?
It's a great question, but I haven't seen any problem so far
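One reason the mixed rates above usually work out: animation tracks are keyed in seconds, and the engine samples each track at its own evaluation times, interpolating between keys. A rough Python sketch of that resampling idea (not Unreal's actual evaluator; `resample` is a hypothetical helper):

```python
def resample(times, values, target_fps, duration):
    """Linearly interpolate a keyframe track at target_fps frame times."""
    out = []
    frame = 0
    t = 0.0
    while t <= duration + 1e-9:
        if t <= times[0]:
            out.append(values[0])          # clamp before the first key
        elif t >= times[-1]:
            out.append(values[-1])         # clamp after the last key
        else:
            # find the key pair surrounding time t and blend between them
            i = max(k for k in range(len(times)) if times[k] <= t)
            a = (t - times[i]) / (times[i + 1] - times[i])
            out.append(values[i] * (1 - a) + values[i + 1] * a)
        frame += 1
        t = frame / target_fps
    return out

# A 60 fps body track and a 30 fps face track, both one second long,
# sampled onto the same 24 fps Sequencer-style timeline.
body_t = [i / 60 for i in range(61)]; body_v = [i * 1.0 for i in range(61)]
face_t = [i / 30 for i in range(31)]; face_v = [i * 2.0 for i in range(31)]

body_24 = resample(body_t, body_v, 24, 1.0)
face_24 = resample(face_t, face_v, 24, 1.0)
print(len(body_24), len(face_24))  # 25 25 -- both tracks land on the same frames
```

Because both tracks cover the same span in seconds, the 24 fps timeline gets a consistent value from each, regardless of the capture rate.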
Is there a way to use props during the recording or will it not read? Like let’s say my scene starts with someone sharpening a knife, will this process work for something like that?
You can use props, they just might occlude your hands. I think with this method though you are probably best just pantomiming it
Is it possible to do face tracking on metahuman that i created by myself using 3D mesh of my face?
Absolutely!
I installed the app (Move One) on my iPhone 12, but it requires my Google or Apple account password!!! Why???
Only 21% of users across the world are using iPhones; more than 70% are using Android. I'm willing to bet those figures are similar for UE users, which means tutorials like this exclude a huge majority
I'm an Android user myself, always have been. I got an iPhone 13 literally just to use with Metahuman Animator. It's a low barrier to entry considering what you get out of it. You can pick up a used iPhone 12 for like $200 on backmarket.com
Does anybody know at what time he explained how to flip the rokoko headrig holder for the phone? Apologies for my laziness, but 2 hours is a large amount of time to look through
I don't show how to do it in the video. You need an Allen wrench, then all you need to do is unscrew the phone holder part and rotate it so it is facing away from your face, and screw it back together
@@NorthwoodsInteractive I really appreciate the quick response! Just tried exactly what you said and got it working. Keep up the wonderful work, it'll eventually pay off
Sir, if I may ask: at 28:57 you deleted the Metahuman control rig for the body, and at 1:21:32 it appears again when you are smoothing out jitters. How does it come back?
Right click on the body in the Sequencer and select Bake to Control Rig
@NorthwoodsInteractive thank you for your kind response
Whats version of your iphone 13? Standart, pro or pro max?
Just standard
We’ll never know what his real voice sounds like
Use OCIO for color
Where would I specify that? In project settings?
The best way to transfer from iPhone to PC is to send it to yourself on WhatsApp (your own number shows up as a contact, so you can message yourself). Or upload your Live Link Face take from your phone to Google Drive, which zips it automatically, then download that zip file from your Drive on your PC. That simple.
Oh yeah, that will definitely work. The Metahuman Animator files can get a little big, so it depends on your internet. Most people should be able to transfer using Drive.
Question: this is not meant for Android users, right??
As of now, Move One is only iOS, but they have android coming soon.
Great to hear that
I feel uncomfortable to watch this tutorial for free
Man, I can't download the Metahuman plugin :C
Try finding it in the Unreal Engine Marketplace
@@NorthwoodsInteractive Yeah, it's not in my vault; I've seen a lot of people with the same issue :C