Metahuman Animator Tutorial
- Published May 29, 2024
JSFILMZ Mocap Helmet: • Cheap Mocap Helmet for...
Mocap Waitlist: JSFILMZMOCAP AT GMAIL DOT COM
Grab my new Unreal Engine 5.1 Course here! Be sure to share it with everyone!
Link to lighting course: www.artstation.com/a/25961360
Link to How to make a movie in UE5.1 www.artstation.com/a/22299532
jsfilmz.gumroad.com/l/lmaqam
My Realistic Warehouse VR Demo: www.artstation.com/a/27325570
My Fortnite Map: 3705-9661-2941
Join this channel if you want to support it!
/ @jsfilmz
Sign up with Artlist and get two extra months free when using my link below.
Artlist
artlist.io/artlist-70446/?art...
Artgrid
artgrid.io/Artgrid-114820/?ar...
@UnrealEngine #unrealengine5 #metahumananimator #metahuman
ITS ON LINKEY DONKEYYYYY! Here is video of the animator app studio.ua-cam.com/users/videoiXH79mrKADM/edit
First rap test ua-cam.com/video/mAT1X4xbEdA/v-deo.html
Do you need the depth capture in Live Link or can you just use video?
Bruh! It is nuts how accurate this thing is.
Yes so glad to hear you’ll be doing more tests with this. Bro just realised I’ve been on this unreal journey with you since your channel was a baby. You’re literally the don in this field. Keep em coming
Incredible! You're so ON IT. Thanks for the walk-thru. Very helpful!!!
Thanks for the tutorial, I can't wait to try it, although it will take hours since the shorts that I'm doing have a lot of dialogue, but it will look a lot more realistic. Best
BIG! I was wanting to explore this, thanks for the tutorials!
Knew this was coming the second I saw the Unreal post about this. Excited to try this out today.
Bro, you are a master!
And the way Unreal is improving this technology day by day is amazing.
thx dude!
You're amazing. Thank you for staying on the pulse!
thanks for being here!
Thank you so much, its amazing
7:00 I'm listening to this video driving around, windows down, level 20 on Bluetooth and at a Bus Stop with little old ladies 🤦♂️
lol
just done my first quick test following your tuts to guide me through the process, got to say wowzer, animator is amazing. so much cleaner than i expected. well worth the iphone rental :}. keep the vids coming :}
oof someones bout to buy an iphone 😂
Great job!
Good and fast tutorial, nice man
tx
You the man bro
Some day i think you will make tutorials before release. haha. So quick!!!!!!!!!!!
hahaha yea they didnt select me for beta access unfortunately i know some did so im gonna have to catch up to them
Awesome.. Randomly woke up at 4am knew there was a reason... thank you!
hahaha
Awesome video! Thanks for the quick setup explanation! But I have to point out that MetaHuman Animator already uses AI for its solves.
When you hit the "Prepare for performance" button, it trains a model on your face to later mimic the way it moves so it can animate other metahuman characters to that likeness. Thats why this step took 8-10 minutes : )
LUV'd your last vid with your little girl (all of your vids ❤ ) 🏆😁👍!
hahah thx man it got 20k views on twitter hahah
Great stuff!
Thanks!
Thanks Lord Helmet!!!
sooo quick , Dude ! you are fast as hell .... 😁
i woke up 5 am today haha
The accuracy looks insane
yea
Thank you for sharing a fun and essential tutorial!! Anyway, is there a way to use the neck animation recorded by the Unreal Live Link facial capture? When I import the facial anim with neck animation and apply it to my metahuman skeleton, the body and face break because of the neck anim. Exporting just the facial anim without neck movement works, but is there a way to keep the neck anim? Is the only solution combining mocap data (with neck anim) with a face-only capture anim (without neck anim)?
This is awesome! Would I be able to take the metahuman/face mocap and export it into blender? I don't need any of the textures or materials. Thanks!
Great showcase! In case you plan to make more test videos, can you show expressions that are hard to do with Apple ARKit? I'm curious how it compares.
For example a sad face with hanging lower lip, asymmetric brow movement🤨, worried face 😟 or any interaction between teeth, lips and tongue.
man im a terrible actor but ill try
@@Jsfilmz Thanks a lot!
Thank you so much for your tutorial, so timely. Also, I am trying to import from the mesh body in UE 5.2, and it seems the tracking marker link can no longer be carried out. Have you encountered this?
Have u done any vids on how to take that head with animation and apply it to another metahuman?
Hey! I just wanted to say thank you for your videos! If it wasn't for you I couldn't have created animation for my metahumans. ❤ Thank you
That is awesome!
Omg yes! My video will never make it if it wasn't for you! I even mentioned you in appreciation in the description ❤️
@@HQLNOH thanks man, not many give credit back to me, i appreciate it
Thanks for the tutorial! Do you have a recorded take file I can use as a test? I dont have an iphone. Thanks
I Hope that behind the scenes they are working on full body mocap
This was jam-packed with so much information. I have to pause this video and slowly follow all of the steps. This is amazing! I'd like to purchase the headset for my iPhone. Did you make it yourself? or did you get it from a website?
home made bro with my sweat and blood its not $129 anymore though its $169 now
Amazing content as always!
Could you please make a video on troubleshooting these three issues:
- “Promote Frame” randomly jumping to a different frame than the selected one.
- Metahuman Identity Solve not accurate result.
- “Add Teeth Pose” breaking the Identity Solve even more.
Thanks a lot!!!!!
A lost detail in Hideo Kojima's DS2 trailer was the end credit saying the performance capture is powered by MetaHuman.
That game is going to be insane levels of detail.
yea hideo was lurking around on my channel when ue5 first came out i was one of the first ones to cover it
So, whenever I want to make a facial animation (of myself doing it with live link) I need to go through these steps? But they can be used on every metahuman?
You're absolutely smashing it man 👏🏽 Thank you so much for the awesome content!
I do have one slight issue though!..
For the life of me, I cannot get the Livelink Face mocap to work with separate body mocap. The head/chest just detaches itself and they are both independent. It's driving me insane. I've tried following some advice on the UE forums to no avail 😭
Have you experienced this yet? any tips?
Thank you
Great video man! I'm getting an error when I hit the process button in the Metahuman Performance. It says "The Processing Pipeline failed with an error." Any ideas on how to fix this would be appreciated. Thanks!
This really is such a big step forward. Very exciting. The only problem is the iPhone only requirement. That makes no sense for something like this, but let's hope it changes sooner or later.
beats buying a real mocap system 😂
@@Jsfilmz True I suppose :)
I need to buy iPhone 12 + first hahaha
looks like 11 works
@@Jsfilmz In their docs they group the X, 11 and 12 together and then 13 and 14 together. Then they give a caveat about the X, saying that its not capable of capturing more than a few sec, but they dont say that about the 11, so yeah, I think the 11 is capable of capturing longer form content like the 12.
@@aaagaming2023 I tested the iPhone 11 Pro last night and it works fine, zero issues.
does someone have it working well with iphone 11? the new epic post says it needs at least iphone 12..
try it broski
...and off we go!
when doing facial mocap, what do you do for a mic? do you have a little one that you attach to your helmet? what kind of mic is good for that?
just my good ole Sennheiser G2
@@Jsfilmz do you attach it to the arm of your helmet?
I wonder how long a given take can be. I have to make some EDU products with this, and they might need to be on the long-ish side.
can i move the mocap data to another software, like blender?
Do you think there will be a marketplace to buy mesh to meta human data scans eventually? For example I don't know any korean girls to run this on with an iphone like that company did but i'd be willing to pay people who know attractive people from every race if they did
Did you see issues of floating head? Face animates and body remains still.. if neck rotation is disabled it's fixed but the neck rotation part of the capture is lost.. Do you know how to avoid that? Thanks! 🙂🙏🏻
is there a way to create the captures from a webcam?
crazy times!
4:43 "Iden-TITTIES" hahahahahaha
Great video man! Thank you for the content.
Hi! It's an amazing video. I tried to import my recorded video from an iPhone 11 but, in the Capture Manager, the video can't be read. Any suggestions?
firewall?
Hope my XS works. Don't want any more Apple pradux
Please try it & please report if it works, I have same iPhone XS.
I am not gonna have access to windows system for at least 15 days, but I am dying to test this feature.
If I'm not wrong, you should turn off neck solving for a head-mounted camera, and use it only for static cameras.
in my case i'll use neck movements from my mocap. the exported sequence doesn't come with neck. check out the new rap video i uploaded
I get the "assertion failed" crash every time I promote the first frame... what should I do?
Do you think it's worth shelling out a bit more money for the iPhone mini 13 over 12?
im broke so ur askin wrong person
Incredible! Is there a way to export Animation data to Maya to have more freedom for further tweaks?
that would be amazing but i dont think thats possible yet
This guy made a script to do it ua-cam.com/video/ecYO5-5fL0U/v-deo.html
Hi, I've asked a lot of people about this but couldn't get a proper answer. After creating an identity and performing with it, my MetaHuman character's default lips and teeth change because of my identity. Is it possible to keep the animation exactly the same but with the character's default facial features?
I may be a little late to the party, but I'm confused: why do you choose LiveLink Archives in CAPTURE SOURCE (which uploads footage from the PC drive?), but then when you go to import you do it from the iPhone in CAPTURE MANAGER?
i made tutorial both ways my usb transfers faster for bigger files
@@Jsfilmz oh sweet as no worries bro. Any tips on uploading from USB then using archives? Seems more complicated to setup the Performance that way...
Having a huge problem when I try to add the animation to my metahuman in the sequencer it becomes detached from the body from around the shoulders area. Was it because I may have moved too much in the capture? Can’t seem to get the body and the head attached
I don't have an iPhone. Will it work with my 2021 iPad pro ?
Did everything to the t and my animation sequence doesn't show up in the Face animation menu. Any tips?
What about body movement and hands?
Hello bro, I just came back to this video because after updating to 5.2.1 it says "Preparation for Performance Failed". Any tips on it bro?
BOSS!
it's here!!!!! downloading plugin!!!!!
hey bro! awesome!
I have a little issue: when I track my face with the animation sequence it's like nothing happened in my sequencer, but it exported correctly because when I open it alone the animation is fine. any idea?
Thank you!
i dont understand :( maybe join unreal discord and post pics and issue there
JS can that be done with DAZ characters too?
Hi! Thanks for the cool video. Quick question: will it work if you upload a regular video shot on a camera, with no iPhone?
dont think so iphones are precalibrated like i showed in the performance editor
No, the software requires depth information which a regular camera can’t provide. Borrow an iPhone 11 or newer.
when I play the level in a new window (PIE)
the Metahuman character is moving his face with my motions very good, but when I click in any window other than the PIE window it starts to lag. and when I click again on the PIE window, the character moves normally !
what should I do ?
epic, insane. looks much better than faceware or livelink. like you say, the curves look smooth with no jitter. and to think i spent 4 hours last night doing 30 sec of manual facial animation that looks rubbish. so if i get a friend who has an iphone they can send me clips?
yes
@@Jsfilmz awesome ,, is there any reason not to get the iphone 12 mini ?
@@kool-movies i havent tested longer takes with 12 mini yet but it used to overheat on me alot hahaha
@@Jsfilmz your rap video is a long take so hopefully it will be good. i just ordered/rented a cheap refurbished one, hopefully it arrives tomorrow :} the 3Lateral video that just released is mind-blowing. assume they used a stereo camera?
It just hangs when I try to bake the animation....???
Super sick! Do you feel like the results are significantly better than the live stream app?
bro go watch my rap with it it will answer your question
after updating the iphone app, the Live Link Face app gives a warning for the MetaHuman Animator capture part saying "your device model is unsupported, you may continue but your results could be affected". it still works, but i wonder: my economic situation is not good and i will be developing games. how much of a performance difference does the iPhone 11 make?
The software requires an iPhone with a TrueDepth front-facing depth sensor because it needs depth data to accurately track your face. You can always borrow an iPhone 11 (or newer) to do the capture and then transfer the file to your computer.
thank you for your answer, what are they doing this warning for, do you think there will be a noticeable quality difference?
@@UnrealEnginecode The quality I got was excellent so I'm not worried about it.
my head keeps detaching from the body when i attach the animation to the face and play it in sequencer
is it possible only with iphone?
Also hey I'd like one of those helmets, but I do have an iphone 11, will that work? It's bigger than the mini, of course.
i know some people whos tried 11 with MHA and they said it works i havent tested it myself
Gonna buy an iphone 12 mini like you have, can you tell me which mount you use so I can buy that and yr helmet? @@Jsfilmz
So how do we connect this to a body so we can add animations to it? 😅
how to add my facial animation to different character
Great vid!! In the showcase, didn't they show a way you could use this app to generate textures for your metahuman? Will you be showing us how as well?
can you send me that video? i dont think i saw that
@@Jsfilmz I think I may be mistaken. I thought the Hellblade II showcase did it, but I think they may have had a premade metahuman.
@@HellMunky What's your workflow? What do you use to generate the mesh/textures that you import into UE to use in the plug-in?
Great video - you mentioned about when MHA will later use AI. Just a heads up it is AI driven currently. Pixel tracking is only a small part of the foundation.
Great video!
wait like its doing ai pose estimations already?
@@Jsfilmz yeah there’s a lot under the hood that’s ML driven already. Facial tracking doesn’t account for wrinkles or much else other than eye and mouth shapes, everything else is interpolated with AI. there’s more coming but the foundation is already utilizing AI in a lot of areas.The second pass animation is still being improved on, so it’ll continue to get better. But it’s a training model based on a lot of human facial animation, to know what to do when cheeks are raised, nostrils flared, eyebrow wrinkles etc
Night and day different to something like live face or other tools which purely track eye and mouth shapes and don’t leverage any AI to them interpolate wrinkles and pseudo face muscles into the animation
@@AllanMcKay oh wow hahaha, crazy stuff. being 1.0 it's not bad. oh btw for iphone it can only output 30fps even when recording 60, right? Thanks, i love knowing about the tech
Hey J, for some reason the result of your test is not very good compared to other tests I have seen online. Do you reckon there was something in the configuration/shooting conditions that interfered? Or perhaps the other tests used other types of cameras, such as stereo cams, rather than iPhone? Thanks for the tut though.
i think the demo videos that came out were done with stereo cams not sure
@@Jsfilmz The rig used when they announced MHA wasn't a metahuman rig; in fact they only showcased how the new system works on metahuman rigs for like 5 seconds at the end of the presentation.
@@matteo.grossi hahaha thats cheating then right lol
Can you do a video of how to warp a metahuman to look like a custom character? RS3D Zwrap is a good wrapper. I have this model of NAS I want to put to the test, as well as Michael Jordan and the Rock.
just mesh to metahuman it mang save u the headache hee hee
@@Jsfilmz Hmmm, I guess I'm a bit behind. Not sure what that is or where it is. I'll look it up. Thanks
@@SkyHandOneTenoh yea man mesh to metahuman jsfilmz look it up its easy
Seems that info is everywhere I was just looking up the wrong terminology all this time. Thanks.
The Performance audio track shows 'Unresolved Binding' though I can hear the audio. When I export the animation no sound is exported......the internet has failed me. Anyone? UE 5.3 and 5.4
Epic is great but they gotta give us some options with the markers, like maybe don't make the ones that go on teeth straight up yellow / green lol? Great video though.
So strange, in Capture Source whatever I put (Live Link or archive) it doesn't find it. It gets green in Capture Manager but nothing shows up🤔
hey man im uploading another tutorial stand by maybe u missed something
@@Jsfilmz great, thank you so much
unreal doesn't see my livelink face recorded file, any help
can i use android video record of my face?
can it be used not with my face but with the one created in metahuman creator? and how to do it?
stay tuned
Were you able to figure out how to connect the iPhone to wirelessly be triggered to record from Unreal?
just the regular livelink way?
@@Jsfilmz With the new MetaHuman Animator version of LiveLinkFace, can't seem to connect. Trying to match my body mocap with the face mocap.
cant do it live animator is offline
@@Jsfilmz I wonder if I can I can use OSC to sync up the recording process on the phone with the mocap
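(Editor's note on the OSC idea above: Epic's Live Link Face documentation does describe an OSC interface the app listens on, so recording can in principle be triggered remotely from a script or from Unreal. Below is a minimal, hedged sketch in Python that hand-encodes an OSC message over UDP; the `/RecordStart` address with slate/take arguments, the default port `8000`, and `phone_ip` are assumptions — verify them against Epic's current Live Link Face OSC reference for your app version.)

```python
import socket
import struct


def _osc_str(s: str) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    b = s.encode("utf-8")
    return b + b"\x00" * (4 - len(b) % 4)


def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message (int32 and string arguments only)."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # big-endian int32
        else:
            tags += "s"
            payload += _osc_str(str(a))
    return _osc_str(address) + _osc_str(tags) + payload


def send_record_start(phone_ip: str, port: int = 8000,
                      slate: str = "scene1", take: int = 1) -> None:
    # /RecordStart with a slate name and take number -- check Epic's
    # current OSC reference for the exact address and argument types
    # your Live Link Face version expects.
    msg = osc_message("/RecordStart", slate, take)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, (phone_ip, port))
```

For syncing with body mocap, the same approach could send `/RecordStart` to the phone at the moment the body take begins, so both recordings share a start point.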
His next epic: "Get Shorty"
I have a problem where the character shows more bottom teeth than top. I don't speak like that and Live Link doesn't do it.
I tried re-tracking face markers but still had the same problem.
Could you provide me with the information about your iPhone 12 model so that I can search for a similar one to buy?
12 mini broski
@@Jsfilmz Thanks!
Do you have to record the videos with the live link app?
Because for every android user this would be a real pain in the *ss and make this tool completely unusable and useless for non apple-device owners.
I tried using a normal mp4 video but it said that there is no footage in the folder.
tell android to get it together, i have android as my main phone
I've done it but how can I render it... help me pls
is there a way to export the head movement? it looks unnatural to have just the facial expression with a still head.
yes watch my tripod tutorial i uploaded today
@@Jsfilmz it looks like that if you use a Metahuman that you already had before the latest update, it won't export the head movement, but if you download a Metahuman now, it will. I guess they have updated all metahumans in the Bridge catalog. I still don't know how to attach it to the body.
@@Jsfilmz I watched it but you are not showing how to attach the body, you just say "I will import a body animation"
Wow. I have the same issue. How to attach the body with proper movement...? I think there is the way to solve it with BP... I checked the Unreal forum but I can't find the proper way.
yeeeeaaaaaaa 🔥
hey J can i borrow 10k subs so i can hit 100k? thanks mang
On the metahuman video they use 4 calibration images, is there a reason why you did not use side views?
i have the helmet on bro hahahaha, they're on a tripod. if i move my head the camera will move too
@@Jsfilmz You could make the calibration video with a tripod and use the helmet for the capture. I don’t know if it will improve the calibration, the metahuman seems a little off especially the nose.
Jo do you have any clue why the metahuman i created in creator does not show up in bridge? It's so frustrating man, i've spent over an hour now trying to get it to work
I am mad as fu!& now😅
update bridge
Is it possible to record facial animation in 5.2, and then retarget it to 5.1 character?
cant go backwards far as i know
Hello, awesome tutorial, but I have a very basic problem, capture manager does not recognize the files, literally the step you make at 1:20 of your video shows no video files for me. I recorded with Iphone 13 mini (supports LiveLink, also realtime works), and I am working on Windows PC. Any ideas what can be the problem?
firewall?
@@Jsfilmz I'm not so sure about that, because the real-time connection works; it's just that the imported video is not recognized at all. I should mention I am using a Windows PC, so might there be a problem while transferring files? I imported the whole zipped folder from OneDrive and unpacked it, but is there maybe another way?
Idk what's going on, it says that it's no longer available 😕 I've been trying to do this for weeks now