Unreal Engine 5.2 brings some awesome new features! I know what to do this weekend! 😁 What are you weekend plans?
Making blender animations! 😂
glad to see you got it working
you dont know a ffing thing about it lol
Okay Sure@@peter486
2 minutes of tutorial and 7 mins of sponsorship 😇
"We don't need expensive gear"
Also him: "I'm using my iPhone right now"
🗿
Yeah... saying "expensive" depends on a person's perspective...
iPhones in general are expensive... And not very durable 😅
if you think an iphone to do facial mocap is expensive it's best you don't go into facial mocap.
iPhones are cheap as hell nowadays, we aren't in the 2010 era anymore, just buy an old one lol...
@@Zinzinnovich You'd need one capable of facial recognition, so don't buy one that's too old. Pretty much any model from the X onward should work.
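Since this thread keeps circling which phone qualifies: the facial capture relies on the front TrueDepth (Face ID) sensor, not the rear LiDAR. A quick sketch of a model check; the device lists below are illustrative assumptions, not an official Epic compatibility list:

```python
# Hypothetical helper: does this iPhone model ship with the front
# TrueDepth (Face ID) sensor used for facial capture?
# The lists are illustrative only; check Epic's documentation for the
# officially supported devices.

# Models with Face ID, and therefore the TrueDepth front depth sensor.
TRUEDEPTH_MODELS = {
    "iPhone X", "iPhone XR", "iPhone XS",
    "iPhone 11", "iPhone 12", "iPhone 13", "iPhone 14", "iPhone 15",
}

# Budget models that kept Touch ID and lack TrueDepth.
NO_TRUEDEPTH_MODELS = {"iPhone 8", "iPhone SE (2nd gen)", "iPhone SE (3rd gen)"}

def has_truedepth(model: str) -> bool:
    """Return True if the model has the TrueDepth front camera."""
    if model in NO_TRUEDEPTH_MODELS:
        return False
    return model in TRUEDEPTH_MODELS

print(has_truedepth("iPhone 11"))           # True
print(has_truedepth("iPhone SE (3rd gen)")) # False
```

Note that having TrueDepth is necessary but not sufficient: Epic's supported-device list for MetaHuman Animator specifically is stricter (commenters below report iPhone 12 and up), while the older Live Link Face workflow ran on anything with Face ID.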
+100 for the neat folder structure and detailed naming of assets! :D Great tutorial!
Would like to see a detailed tutorial from start to finish, from a blank project to final render. Not just face mocap but also body animations. It is often quite complex and confusing given the number of steps and different interfaces required. Many of the "tutorials" do not explain things properly and often skip steps or start with "I have already set up the …". Also, the accuracy of the lip sync is often off, and letters like B and P are not done with closed lips. And how do you animate the eyes to be more focused on the camera when needed? I would ideally like a single tutorial instead of having to watch dozens of videos, each doing something different and not always working as expected.
That's the whole point. Don't you think it's strange that THEY ALL suffer from the same total arrogance, stupid jokes, and half-info, plus sponsoring their mates' crap for you to buy and get stuck with, while they make dollars?
have you found one yet? looking for the same
@@LiterallyJord Me too, did you guys find anything?
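On the B/P closed-lips complaint above: one way to audit a take without scrubbing every frame is to scan the exported per-frame blendshape values and check whether the lip-closure curve ever actually peaks. A rough sketch, assuming a CSV export with blendshape names in the header row; the `MouthClose` column name follows ARKit's naming but should be verified against your own export, and the threshold is a made-up number:

```python
import csv
from io import StringIO

def lips_never_close(csv_text: str, column: str = "MouthClose",
                     threshold: float = 0.5) -> bool:
    """Return True if the given blendshape never rises above `threshold`,
    i.e. the solve likely never fully closed the lips during the take.
    Column name and threshold are assumptions; check your export's header."""
    reader = csv.DictReader(StringIO(csv_text))
    peak = 0.0
    for row in reader:
        try:
            peak = max(peak, float(row[column]))
        except (KeyError, ValueError):
            continue  # skip malformed rows or a missing column
    return peak < threshold

# Tiny synthetic take: the lips close on the middle frame.
take = "Timecode,MouthClose\n00:01,0.1\n00:02,0.8\n00:03,0.2\n"
print(lips_never_close(take))  # False, MouthClose peaks at 0.8
```

Flagged takes can then be fixed by hand in the control rig at just those timecodes, rather than re-animating the whole performance.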
Just a note: your metahuman has a bowling-ball-heavy lower face because you scanned yourself with a big beard. The solver doesn't know the difference between hair and skin, so it adds your beard as geometry, covers it in skin, and then you add your beard on top of that. You'd have a much better match if you went clean-shaven for the scan-to-MetaHuman/Animator creation and then did your captures after that. From that point on you can use Animator while having your beard and apply the animation data to your non-bulbous metahuman, just as you would apply it to anyone other than you.
I have a massive mustache and metahuman thinks I have an overbite. Will have to shave it but it should grow back in one to two days.
My nba2k player always looked ridiculous because of that
@@44punk Yeah, that's how Freddie Mercury got those big front chompers. He had that mustache when his face was created.
You guys are great, you show from start to finish for a tutorial at a followable pace.
how do i fix the floating head
YESS CINECOM UPLOADS ALWAYS MAKE MY DAY!
Don't forget, this is just the first step... in a few months or years this is gonna be even better... amazing!
Hey Jordy, AKA MASTER ARTIST 😁 your Skillshare courses are awesome, man. Funny and straight to the point 😃 Thank you so much.
This is a game-changer for face mocap! 🙌
You don't need LiDAR, just the TrueDepth face cam on iPhone.
😁 Oh yeah. A depth cam can't map objects; it just gives the depth of whatever is in front of it. You can't get movement without LiDAR. Otherwise VFX films and games would just use a depth cam for full-body movement instead of a full-body suit with tracking points.
@@SggQpwpqpq-vq3ds Knew that; I was just pointing out the mistake about MetaHuman Animator in this particular video.
TrueDepth camera is lidar bruh
That’s just lidar
Great tutorial man! This does skim over a lot of important details though. For example, what you showed here is only enough to animate the face and play it back, nothing else. Would've been nice if you talked about baking to the control rig to do some tweaks, or even how to animate the body along with a performance (your final export showed body and face). Also, exporting the animation the way you did will only work on your specific metahuman, since you specified that mesh. If you exported to the MetaHuman skeleton instead, it could work on any metahuman.
This is correct, I am trying to figure out how to connect the body and the head together without the head floating away
Exactly... instead of all the promoting of their friends' add-ons and soft/hardware, give us info that's realistic and complete.
I never thought I needed to see jordy with shredded abs and chest 😂
8:22 - Performance to existing MetaHuman downloaded from Quixel Bridge / Created from MetaHuman Creator / Custom MH. Thank you!!! I was looking for that everywhere.
this is mind blowing, does it work with android as well as long as it has LIDAR?
that was great nice bruh . love from indonesia
Is it possible to do on an Android phone?
Thanks for showing...I just wish it was even easier...so many menus to Open, so many options to choose wrong, so many steps to remember
I thought i was alone in this lost adventure 🤣🤣🤣🤣
Thank you. This was really helpful
Thanks for the quick funny workflow. ❤😂🎉
I've got a problem at 8:28: I don't see it in the drop-down tab. I cannot choose the MetaHuman Performance asset, even though it's in my folder.
Thanks for uploading video 😊
I clicked on the face and then clicked on the asset in animation mode. And I clicked on the animation sequence (performance), but the video does not play. What's the problem?
Same problem mate
I wouldn't call this high quality but its better than what we used to have.
*Is there any way to do this live with your phone?*
Hi, can this method work on an Android phone?
Thank you
Unreal looks almost real now!
Happy Friday!
Seems like Unreal Engine's native support for mocap is only available using an iPhone (besides the professional equipment), am I right?
I saw some webcam versions of face mocap software, but they don't seem smooth.
Buying a used iPhone is the only cheap way to do face mocap, I guess.
Which link is for how to create a meta human?
So good, thanks!
Love the videos!
you guys are the best
So cool!
I don't get it: all these people promoting how amazing this feature is, and literally no one's talking about how it's completely unusable for complex animations because of huge bugs with head rotation, body disconnection, baking, etc.
Author:
We don't need expensive gear.
Author also:
I'm using my iPhone
Still waiting for the Body Motion Capture 😅
I wonder if this would work with any other 3D model with enough blend shapes. This would be a really nice and easy way to animate not just the MetaHumans. BUTTTTT Jordie... I'm surprised you guys are only noticing this now?! Live Link has been a thing since UE 4.24 or .26, and MetaHumans since like the 5.0 beta, lol. BUT you still made it simple and fast to ingest and start creating, thank you :)
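On the question of driving other models: a facial capture stream is ultimately just a weight per expression per frame, and any mesh with matching morph targets can consume it with plain linear blendshape math. A toy sketch with entirely made-up numbers:

```python
# Linear blendshape math: the mocap stream delivers one weight per
# expression per frame; any mesh with matching morph targets can use it.
# Toy data: 2 vertices, 2 shapes, all numbers invented for illustration.

base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]          # neutral positions
deltas = {
    "jawOpen":    [(0.0, -1.0, 0.0), (0.0, -1.0, 0.0)],
    "mouthSmile": [(0.5,  0.0, 0.0), (0.5,  0.0, 0.0)],
}

def apply_weights(base, deltas, weights):
    """Per vertex: v' = v + sum_i(w_i * delta_i)."""
    out = []
    for vi, (x, y, z) in enumerate(base):
        for name, w in weights.items():
            dx, dy, dz = deltas[name][vi]
            x, y, z = x + w * dx, y + w * dy, z + w * dz
        out.append((x, y, z))
    return out

# One captured frame: jaw half open, slight smile.
frame = apply_weights(base, deltas, {"jawOpen": 0.5, "mouthSmile": 0.2})
print(frame[0])  # first vertex, moved to x=0.1, y=-0.5, z=0.0
```

The practical catch is the naming: the capture side emits a fixed set of expression names, so a custom Maya/ZBrush character needs morph targets that match (or are remapped to) those names before the weights mean anything.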
How did you fix the floating head?
I cannot figure it out for the life of me.
Do you have one for combining body and facial capture like you did here?
how do i fix my separated head?
Very COOL! However, I have been looking for a way to do this with a character that is not a MetaHuman, a character that I have modeled in Cinema 4D... What is the best way to do this face mocap with a character like that?
Is it possible only with an iPhone?
Are there significant improvements for the later editions of phones?
It was so unusual to see your neutral face since you are mostly very animated! Love your videos!!
I would go with an open lip when picking an open frame .
Any iPhone 11 or up works?
Is this possible with Android? 😅
There was a company called Faceshift that Apple bought up. That software is embedded in all iPhones; that's why this works. It's not Unreal doing the heavy lifting.
Prior to that, Faceshift worked with a $50 Kinect.
Nice tut. Why did you mention that it requires the LiDAR sensor when you haven't used the LiDAR sensor in the video?
It is used automatically by the iPhone when he records with the MetaHuman app... it gives all the data, not just a normal video.
Wait a sec, the iPhone 11 doesn't have a LiDAR sensor 🤔
God damn it, another iPhone tracking video. I'M NOT BUYING AN IPHONE JUST TO TRY SOMETHING FOR 5 SECONDS!
any chance of getting this to work on mac? metahuman plugin is windows only 😢
Crazy! Could this be used for lip sync and just an image of a face?
Metahuman Animator does not support iPhone 11, only 12 and above.
How do you add clothes?
Hi, can I use Blender or Unreal Engine on an iPad? If yes, which is the best iPad for rendering and editing in Blender / Unreal Engine?
My only question is:
can you do a full-body version for the process of creating a character's movements?
We don’t need expensive gear, just an iPhone 📲
😅😅 nice tutorial.. Thanks
Nightmare fuel!
Haven't you skipped the FIT TEETH part? Or is it alright to bypass that button?
Thank you, but how do I make shape keys in a fast way? Is there any program for that?
Is this "live link" available on PC? Using a phone is a little bit unprofessional and cheap.
Here's me hoping that one day there will be a way to utilise these types of technologies in a much simpler way.
My metahuman doesn't have hair when I include it in my project.
Same problem for me too
Everyone... with an iPhone.
Can't wait for a body mocap with the same 4D tech.
so many steps. I think this can still be made more efficient
That's how you always are: you show a lot of great things, but I haven't seen a single video of yours with a proper explanation of how to get the desired effects.
*yup this world moves so fast*
i like 70% of this video is random ads
Nice ✨✨
Please, I have a problem! After installing the MetaHuman plugin from the marketplace, when I start UE 5.3 I can't find the plugin in the list!!! Any solution? Thank you.
Did you ever make games?
Excuse me, what is the price of the suit and where can it be purchased? And what if you can only record with an Apple phone? Greetings from Mexico.
I can't upload the Live Link files into UE5. When I try to drag them into the content folder, it says error. Any idea what's up?
And just like that……. I need to watch again. Lol
So fun!
Does this only work on metahumans or can i use my own 3d model with this face mocap
Wait, I thought that Iphone X or greater was all that was necessary for using Live Link?
Is there a way to animate a non Metahuman face?
What app do you use with an Android? I have a Samsung Galaxy S23 Ultra.
LiDAR sensor on iPhone 11??
Thanks very useful
Thanks for the great tutorial, but could anybody help: why doesn't the animation work when I track it to the face mesh in the sequencer, after all the steps have been done? Can't find any info on how to solve it.
Is there a way we can replicate this using the Pixel 4 XL's IR camera?
Clickbait. They want you to buy an "iphone".
I just had to
Like just every good MoCap
I love the section where you explain how movie special effects were created.
and i have one quick question.
Is a vfx certificate required for employment in a company like MPC, SUNRISE WETA, etc?
Absolutely not required. You "just" need to be good at what you're doing. They don't care about a degree. You can learn everything you need through YouTube, Google, online courses, and a lot of trial and error on your own. The one advantage a school might bring you is contacts and connections in the industry. But generally I wouldn't recommend studying VFX, unless it's at one of the top schools... and those are pretty expensive.
Dude, what are you talking about? This was possible with the Live Link app on iPhone years ago. But it was live, and you had to record the whole thing, put it together in the sequencer, and bake an animation... I made myself a lot of animations last year, same quality.
Live Link Face was available, yes, but the way the animation data is processed and the number of facial data points are far different. The use of the depth data is the key change, and the final product here is WAY better. I too animated lots of faces with Live Link Face, and there is simply no comparison.
Sorry, might be a noob question... Does this app's mocap data work only with MetaHuman characters? Or can we use it on our own characters created in Maya/ZBrush? Because everyone is only mentioning MetaHuman characters.
Is this Unreal Engine guide or a promo video?
Promo video with a course commercial at the end lmao
How to fix the floating head?
Does this work on ipad pro with true depth sensor too?
"Don't need expensive gear", "I'm using my iPhone"
Last time I checked, iPhones (with LiDAR, like the iPhone 11) don't fall in the "not expensive" category.
Sir, I have a sci-fi story. I want to make it into a short film, but I have no money and no software. Can you tell me how to do it? Please, any suggestions, sir.
Used Live Link, but it's saying unsupported video format :/ I'm using a 15 Pro.
Where's the link on how to create a custom metahuman?
“We don’t need expensive gear” using an iPhone
I've got the problem, that the "MetaHuman Plugin" is unavailable for me. Anyone else with this problem?
What do you think about the possibility of using Apple's upcoming Vision Pro for virtual production? Could it make LED walls obsolete? Could actors and crew use this to work remotely?
Hey, I would actually greatly appreciate the help: can we create a metahuman from the Live Link video data? Like, when we don't already have a metahuman of the person in the video but want to create one with the help of the Live Link video.
It would have been cool if you had also addressed the floating-neck problem shown at the end, etc.