Hi, thanks for being part of my community! If you've got any questions about the video, please let me know and I'll try to help out! 🙂
Wow🥺Watching this felt like I was seeing a completely different application!😅Amazing video! Huge thanks for showcasing my plugin🤗
Thank you so much for the amazing plugin! Metahuman Animator is finally available to the rest of the world without iPhones hahah. Massive thanks!
Thanks for working on a plugin which lets us do face animations for MetaHumans with an Android device or a webcam!
Though one question I have: In this video it looks like your plugin is only capable of using recorded videos for the face animation. Is this correct or is your plugin also able to do live capture?
@@manollobango I'm glad you liked it ^^ and Yes that's correct! For now only recorded videos are supported to create metahuman performances 👍
@@xlipdev
Thanks for the quick reply!
Are you planning to add live capture?
@@manollobango I haven’t explored the live capture pipeline yet, but afaik it will initially require an Android app or something similar to communicate with the engine/editor plugin, so honestly I can't promise that the live capture feature will be added very soon 🥲 Let me beat Apple's MetaHuman performances first, then I’ll dive into it! 😅
Excellent tutorial! The first of its kind that catches my attention. Great presentation format too, very clear and informative. Thanks for putting it together!
Thanks so much for your video tutorial, I wish you good health.
Thats very kind of you! Have a wonderful day! 🙂
My preview depth frame doesn't show anything, just like yours at 4:14. I use UE 5.5 and plugin version 1.7. I can't see the rotation angle. Which version do you use: 1.0, 1.1 or 1.7? Thanks for the trouble.
Can you capture a video from an iPhone with native depth, process that animation, then bring the same video file in as a file source, run the method you showed to generate depth frames that way, and process the mocap, to compare whether the depth data is better from the iPhone or from this? I would love to see the same acting processed both ways to see whether this method is actually superior to the depth data the iPhone generates.
Also, a test with higher-fps recording: does that generate better motion capture because there are many more subtle in-between frames to look at, versus something basic like 30 or 60 fps?
Great video nonetheless, thank you!
I've had this thought too. I have an iPhone 12 I can test this with. :)
Thanks a lot, trying now...
Great! Let me know what you think! :)
Is there a way to change the body type, the model that is used for the MetaHuman?
Of course, you can use this with any character that you want! :)
Does the plugin record time code from a Bluetooth source like the Live Link Face plugin?
The recording comes from whichever camera you prefer, so if your camera can write timecode to a clip, then you can use it! 🙂
Hi, cool video! Do you need to install the MetaHuman SDK?
Thank you! You need to download and enable the MetaHuman plugin from Unreal Engine marketplace/FAB. It's free.
Is there any plan to develop in lower version?
Probably not, I recommend trying 5.4. :)
Fantastic!
Glad you like it!
Omg that’s amazing
It really is! Thanks for watching! 😄
can you put it to fab market
It will soon be on Fab! 🙂
Does it work offline or is it just a bridge to some online service?
Good question! Actually haven't tried. I'll try offline and see what happens. 😃👏
Is there any free option for Android
This is for Android! And it's only 20 dollars, that's about as much as a lunch at McDonald's. Totally worth it if you ask me.
introoooo
😄
Cool, but not equal to a simple iPhone 13 mini, which costs next to nothing today.
It's true. The iPhones are better. But this is $20 and works with any video, so hey. 😄
@ Yeah, that's cool and enough for game NPCs. I use it for movie rendering on my channel, so in my case I need better, not less. :)
You are too late. There is sound-to-face animation in Unreal 5.5.
There is! But that only creates mouth animation, not the angry/happy/sad expressions, eye blinks etc. :)
@@tiedtkeio It still doesn't seem to work well :(
I think it's better to use Sound to Anim and then manually add the emotions on an additional layer
It's just for mouth, no emotion added
@@tiedtkeio Thanks. I'm looking for something like quickmagic AI for face animation instead
Idk, but for me it doesn't produce great lip animation, especially in my language (Indonesian). That's why I keep recording with the iPhone and cleaning up the animation.