THIS IS HUGE! Everyone can do High Quality Face Mocap now!
- Published 22 Jun 2023
- (advertising @hollyland) Check out the Hollyland ecosystem: www.hollyland.com - Learn how to use the new MetaHuman Animator in Unreal Engine 5.2. Capture highly detailed face motion data for your custom MetaHuman in this tutorial video.
Hollyland Products that we use
Solidcom C1 Pro ► www.hollyland.com/product/sol...
LARK M1 ► www.hollyland.com/product/lar...
Cosmo C1 ► www.hollyland.com/product/cos...
Learn Unreal Engine 5 for Beginners
► www.cinecom.net/courses/unrea...
More Unreal Engine Tutorials
► • 5 Tricks you (probably...
Read More
► www.cinecom.net/unreal-engine...
👕 MERCH
► cinecom.net/merch
🎬 Check our Award Winning Courses
► cinecom.net/courses
💙 LETS CONNECT!
Instagram ► cinecom.info/Instagram
Discord ► / discord
💥 Download Unlimited Video Assets
► storyblocks.com/Cinecom
#Cinecom
Unreal Engine 5.2 brings some awesome new features! I know what I'm doing this weekend! 😁 What are your weekend plans?
Making blender animations! 😂
glad to see you got it working
You don't know an effing thing about it lol
Okay, sure @peter486
"We don't need expensive gear"
Also him: "I'm using my iPhone right now"
🗿
+100 for the neat folder structure and detailed naming of assets! :D Great tutorial!
You guys are great, you show from start to finish for a tutorial at a followable pace.
How do I fix the floating head?
would like to see a detailed tutorial from start to finish. From a blank project to final render. Not just face mocap but also body animations. It is often quite complex and confusing as to the number of steps and different interfaces required. Many of the “tutorials” do not explain properly and often skip steps or start with “ I have already set up the …”. Also, the accuracy of the lip sync is often off and letters like B and P are not done with closed lips etc. Also how to animate the eyes to be more focused on the camera when needed. I would ideally like a single tutorial instead of having to watch dozens of videos each doing something different and not always working as expected.
That's the whole point. Don't you think it's strange that THEY ALL suffer from the same total arrogance, stupid jokes and half-info, plus sponsoring their mates' crap for you to buy and get stuck with, whilst they make dollars?
have you found one yet? looking for the same
2 minutes of tutorial and 7 mins of sponsorship 😇
This is a game-changer for face mocap! 🙌
YESS CINECOM UPLOADS ALWAYS MAKE MY DAY!
Just a note - your MetaHuman has a bowling-ball-heavy lower face because you scanned yourself with a big beard. The solver doesn't know the difference between hair and skin, so it adds your beard as geometry, covers it in skin, and then you add your beard on top of that. You'd have a much better match if you went clean-shaven for the scan-2-metahuman/Animator creation, and then did your captures after that. From that point you should be able to use Animator while having your beard and then apply the animation data to your non-bulbous MetaHuman, just as you would apply it to anyone other than you.
Foolish to cut a glorious beard for just some internet project lol
I have a massive mustache and metahuman thinks I have an overbite. Will have to shave it but it should grow back in one to two days.
My NBA 2K player always looked ridiculous because of that
@@44punk yeah, that's how Freddie Mercury got those big front chompers. He had that mustache when his face was created.
Don't forget, this is just the first step... in a few months or years this is gonna be even better... amazing!
Hey Jordy, AKA MASTER ARTIST 😁 your Skillshare courses are awesome man. Funny and straight to the point 😃 Thank you so much.
Thanks for the great tutorial, but could anybody help: why doesn't the animation work when I track it to the face mesh in the sequencer, after all the steps have been done? I cannot find any info on how to solve it.
Great tutorial man! This does skim over a lot of important details though. For example, what you showed here is only enough to animate the face and play it back, nothing else. Would've been nice if you talked about baking to the control rig to do some tweaks, or even how to animate the face with a performance (your final export showed body and face). Also, exporting the animation the way you did will only work on your specific MetaHuman, since you specified that mesh. If you exported the MetaHuman skeleton instead, it could work on any MetaHuman.
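The point in the comment above can be sketched conceptually. This is NOT the Unreal API, just a minimal Python illustration of why animation keyed to the shared rig names transfers to any MetaHuman, while animation tied to one specific mesh asset does not. The curve names below are hypothetical stand-ins in the spirit of the MetaHuman face control names.

```python
# Conceptual sketch (not Unreal code): a facial performance exported as
# named curves. Every MetaHuman shares the same face rig, so curves keyed
# by name apply to any of them; data bound to one mesh asset would not.

def apply_animation(animation, rig_curve_names):
    """Return only the curves the target rig actually exposes."""
    return {name: keys for name, keys in animation.items()
            if name in rig_curve_names}

# Hypothetical curve names and key values for illustration.
performance = {
    "CTRL_expressions_jawOpen":     [0.0, 0.4, 0.7],
    "CTRL_expressions_mouthSmileL": [0.0, 0.2, 0.5],
}

# A different MetaHuman exposing the same control names receives it all.
other_metahuman_rig = {"CTRL_expressions_jawOpen",
                       "CTRL_expressions_mouthSmileL"}
retargeted = apply_animation(performance, other_metahuman_rig)
print(len(retargeted))  # prints 2: both curves transfer
```

In Unreal itself the equivalent choice happens at export time, when you pick the shared skeleton rather than the character-specific mesh.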
This is correct, I am trying to figure out how to connect the body and the head together without the head floating away
Exactly... instead of all the promoting of their friends' add-ons and soft/hardware, give us info that's realistic and complete.
What about live lips and expressions with the Quest Pro in Unreal / MetaHuman etc.?
Thanks for the quick funny workflow. ❤😂🎉
Thank you. This was really helpful
Thanks for uploading video 😊
So good, thanks!
So cool!
It was so unusual to see your neutral face since you are mostly very animated! Love your videos!!
I love the section where you explain how movie special effects were created.
And I have one quick question.
Is a VFX certificate required for employment at a company like MPC, SUNRISE, WETA, etc?
Absolutely not required. You "just" need to be good at what you're doing. They don't care about a degree. You can learn everything you need through YouTube, Google, online courses and a lot of trial and error on your own. The one advantage a school might bring you is maybe contacts and connections in the industry. But generally I wouldn't recommend studying VFX, except at one of the top schools... and they are pretty expensive.
This is mind-blowing! Does it work with Android as well, as long as it has LiDAR?
What app do you use with an Android? I have a Samsung Galaxy S23 Ultra.
Thanks for showing... I just wish it was even easier... so many menus to open, so many options to choose wrongly, so many steps to remember.
I thought i was alone in this lost adventure 🤣🤣🤣🤣
Hey, I would actually greatly appreciate the help: can we create a MetaHuman from the Live Link video data, as in when we don't already have a MetaHuman of the person in the video but want to create one with the help of the Live Link video?
What do you think about the possibility of using Apple's upcoming Vision Pro for virtual production? Could it make LED walls obsolete? Could actors and crew use this to work remotely?
I wonder if this would work with any other 3D model with enough blend shapes. This would be a really nice and easy way to animate not just the MetaHumans. BUT Jordy... I'm surprised you guys are only noticing this now?! Live Link has been a thing since UE 4.24 or 4.26, and MetaHumans since around the 5.0 beta, lol. BUT you still made it simple and fast to ingest and start creating, thank you :)
You don't need LiDAR, just the TrueDepth face cam on an iPhone.
😁 Oh yeah? A depth cam can't map objects, it just gives the depth of any object. You can't get movement without LiDAR. Otherwise VFX films and games would just use a depth cam to capture full-body movement instead of a full-body suit with tracking points for LiDAR.
@@SggQpwpqpq-vq3ds I knew that, I was just referring to MetaHuman Animator and the mistake included in this particular video.
TrueDepth camera is lidar bruh
Are there significant improvements for the later editions of phones?
Wow! I wanna know how to make a MetaHuman that looks so much like yourself!!! When I use MetaHuman Animator or MetaHuman Creator, the face doesn't look much like me.
I never thought I needed to see jordy with shredded abs and chest 😂
Love the videos!
Thank you, but how do I make shape keys in a fast way? Is there any program for that?
Unreal looks almost real now!
Happy Friday!
It would have been cool if you had also addressed the floating-neck problem at the end, etc.
Seems like Unreal Engine's native support for face mocap is only available using an iPhone (besides the professional equipment), am I right?
I saw some webcam versions of face mocap software, but they don't seem smooth.
Buying a used iPhone is the only cheap way to use face mocap, I guess.
Hi, can I use Blender or Unreal Engine on an iPad? If yes, then which iPad is best for rendering and editing in Blender / Unreal Engine?
Does this work on an iPad Pro with the TrueDepth sensor too?
Nice tut. Why did you mention that it requires the LiDAR sensor when you haven't used the LiDAR sensor in the video?
It is automatically used by the iPhone when he records with the MetaHuman app... it gives all the data and not just a normal video.
So fun!
Author:
We don't need expensive gear.
Author also:
I'm using my iPhone
you guys are the best
Nice ✨✨
How to fix the floating head?
My only question is:
can you do a full-body one for the process of creating a character's movement?
Do you have one for combining body and facial like you did here?
How do I fix my separated head?
Hi, can this method work on an Android phone?
Thank you
Why can't I save this video to my playlist?
Crazy! Could this be used for lip sync and just an image of a face?
Does this only work on MetaHumans, or can I use my own 3D model with this face mocap?
ACTUAL GOLD
Here's me hoping that one day there will be a way to utilise these types of technologies in a much simpler way.
God you are good at what you are doing. Thanks
Wow, that is impressive... I'm Michael from Africa. I'm new to visual effects, so I need a mentor who will teach me. I know that we are from different places, but I have faith that you will help me.
*Is there any way to do this live with your phone?*
Is there a way we can replicate this using the Pixel 4 XL IR camera?
How did you fix the floating head?
I cannot figure it out for the life of me.
Is it possible only with an iPhone?
Any chance of getting this to work on Mac? The MetaHuman plugin is Windows-only 😢
Like this)
Can I use a HMC?
So many steps. I think this can still be made more efficient.
Ok, but how exactly do I do hand tracking?
Sorry, might be a noob question... Does this app's mocap data work only with MetaHuman characters? Or can we use it on our own characters created using Maya/ZBrush? Because everyone is just mentioning MetaHuman characters only.
That was great, nice bruh. Love from Indonesia.
That's how you talk: you show a lot of great things, but I haven't seen a single video of yours with a proper explanation of how to achieve the desired effects.
Where can i test video to face?
Is it possible to do on an Android phone?
I want to know if you can create my avatar so that it can also work on Zoom and in social media videos, and be able to easily modify its wardrobe. If so, I want to understand your rates and how to contact you. Thank you.
Nightmare fuel!
Is this possible with Android? 😅
Haven't you skipped the FIT TEETH part? Or is it alright to bypass that button?
Wait, I thought that Iphone X or greater was all that was necessary for using Live Link?
Can you do it on android?
So, explain to me why someone would even need an iPhone if Unreal Engine is doing the actual facial capture with the green tracking marks? I would also LOVE a video from you on the Unreal Engine and Character Creator 4 from Reallusion's workflow for character animation. Both full body, facial etc, it might actually be superior to Unreal Engine, more universal, and easier to set up 🤔 but your opinion and a video would be awesome
My guess is that the iPhone app is bundling the RGB video data with the depth from the iPhone's camera (and possibly its ARKit 3D model as well). If that's happening, then Unreal is likely matching its analysis of the RGB video with the extra information the app gave, using both together to give a better result.
I'd love to see how this compares with just using RGB video, since I don't have an iPhone.
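The guess above can be sketched as a simple data-packaging step. This is purely illustrative Python, assuming (unverified) that the capture footage carries a per-frame depth map alongside the RGB video; the buffers and frame rate are made up.

```python
# Illustrative only: pair each RGB frame with its depth map so a solver
# could consume both streams together. Assumes (unverified) that the
# capture app stores synchronized per-frame depth next to the video.

from dataclasses import dataclass

@dataclass
class CaptureFrame:
    timestamp: float   # seconds since capture start
    rgb: list          # placeholder for an RGB image buffer
    depth: list        # placeholder for a depth map buffer

def bundle(timestamps, rgb_frames, depth_frames):
    """Zip synchronized RGB and depth streams into solver-ready frames."""
    if not (len(timestamps) == len(rgb_frames) == len(depth_frames)):
        raise ValueError("streams are not synchronized")
    return [CaptureFrame(t, rgb, d)
            for t, rgb, d in zip(timestamps, rgb_frames, depth_frames)]

# Two frames at 30 fps, with dummy image buffers.
frames = bundle([0.0, 1 / 30], [["rgb0"], ["rgb1"]], [["d0"], ["d1"]])
print(len(frames))  # prints 2
```

The useful property this models is that depth and color share a timestamp, so a solver can cross-check its video analysis against measured geometry per frame.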
I have got a problem at 8:28: I don't see it in the drop-down tab and cannot choose the MetaHuman Performance asset, even though I have it in the folder.
Still waiting for the Body Motion Capture 😅
I clicked on the face and then clicked on the asset in animation mode. And I clicked on the animation sequence (performance), but the video does not play. What's the problem?
Same problem mate
There was a company called Faceshift that Apple bought up. That software is embedded in all iPhones; that's why this works. It's not Unreal doing the heavy lifting.
Prior to that, Faceshift worked with a $50 Kinect.
I thought only the Pro Max versions had LiDAR.
I can't upload the Live Link files into UE5. When I try to drag them into the content folder, it says error. Any idea what's up?
Please, I have a problem! After installing the MetaHuman plugin from the Marketplace, when I start UE 5.3 I can't find the plugin in the list!!! Any solution? Thank you.
Wait a sec, the iPhone 11 doesn't have a LiDAR sensor 🤔
We don’t need expensive gear, just an iPhone 📲
😅😅 nice tutorial.. Thanks
I would go with an open lip when picking an open frame .
While playing my character, I'm unable to hear the sound I recorded with Live Link Face on my iPhone. See the timeline at 8:30?
Any iPhone 11 or up works?
👍🏻👍🏻
A LiDAR sensor on the iPhone 11??
*yup this world moves so fast*
I posted a question about Unreal Engine 5.2 and whether it is possible to use imported animals, like a fully rigged gorilla made with 3ds Max. Maybe I should not post a link here, sorry. This could change everything for a filmmaker - will buy your course.
Yes
@@liampugh could you answer in more detail please? I am totally new to and interested in Unreal Engine. I do have a film studio with a green screen and some good lights. Do you think it is realistic to set this up using the course? Is there support in the course?
Is this "Live Link" on PC? Using a phone is a little bit unprofessional and cheap.
Did you ever make games?
So cool to see you make more UE5 stuff 🤩
Thank you for doing this video... as we usually have to search every UE YouTube video for this info. The problem is... in one year, UE will probably change the interface and procedure, making the above video obsolete!
And just like that……. I need to watch again. Lol
I've got the problem, that the "MetaHuman Plugin" is unavailable for me. Anyone else with this problem?
Is this Unreal Engine guide or a promo video?
Promo video with a course commercial at the end lmao
What about samsung/android??
iPhones have LiDAR; if another manufacturer makes a LiDAR phone then it would work, but this app is not on Android - not on Samsung, LG, Huawei, etc.
Dude, what are you talking about? This was possible with the Live Link app on iPhone years ago - but it was live, and you had to record the whole thing, put it together in the sequencer and bake an animation... I made myself a lot of animations last year, same quality.
Live Link Face was available, yes, but the way the animation data is processed and the number of facial data points are far different. The use of the LiDAR is the key change, and the final product here is WAY better. I too animated lots of faces with Live Link Face and there is simply no comparison.
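For context on the "data points" difference debated above: the older Live Link Face workflow streams ARKit's 52 blendshape coefficients per frame (the app can also record them for later use), whereas MetaHuman Animator solves much denser facial data offline. A minimal sketch of reading a per-frame blendshape table - the CSV column names here are illustrative, not the app's exact header:

```python
# Minimal sketch of parsing a Live Link Face-style blendshape table:
# one row per frame, one column per ARKit blendshape coefficient in
# [0, 1]. Column names below are illustrative, not the exact export.

import csv
import io

SAMPLE = """Timecode,JawOpen,MouthSmileLeft
00:00:00:00,0.10,0.00
00:00:00:01,0.35,0.20
"""

def read_blendshape_frames(text):
    """Return a list of {blendshape_name: value} dicts, one per frame."""
    frames = []
    for row in csv.DictReader(io.StringIO(text)):
        frames.append({name: float(value) for name, value in row.items()
                       if name != "Timecode"})
    return frames

frames = read_blendshape_frames(SAMPLE)
print(frames[1]["JawOpen"])  # prints 0.35
```

With only ~52 scalars per frame, lip shapes are limited to what those blendshapes can express, which is consistent with the comment that Animator's denser solve looks far better.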