Free Training for a career in 3D Animation: ebook.digitalcreatorschool.com/animatorsjourneyfreetraining
Hello, I want to animate a character's face in an animation of mine. Does that work with Live Link as well? Can you record the facial expressions with the camera and use that recording in the Sequencer?
You jumping on the ball with the new Metahuman stuff is A-tier.
Everybody out here is confused, and there are almost no resources that get to the point like yours. Nice work.
Ha, thanks! I know the feeling of being frustrated by a lack of clear instructions; that's what got me started creating courses. Cheers.
@@AnimatorsJourney You realize no one, and I mean no one, in your audience understands anything you are saying - not the programs, nothing. You need to do a step-by-step tutorial from zero, not start way in the middle.
sooo true
Wow you didn't need the sample MetaHuman file, nor did you make any blueprint adjustments. I'm blown away by how simple you made this work using that iPhone LiveLink app connection. Amazing!
PLOT TWIST: the guy in the bottom right is a MetaHuman!
Metahumanception
I just discovered Metahuman. I gotta try this.
I learned Maya thanks to you. It is good to see that you will help us with Unreal too. Amazing!
That makes me happy to hear I've helped :) Thanks for commenting!
Great intro. They still have issues with mouth closure, but there are workarounds out there.
My question is... I'm not made of money and can't afford a new iPhone or even an iPad Pro.
So I'm looking to buy a refurbished or used iPad, and I can't find a list of older models that will work with Live Link and do the face recognition. At first I assumed it was the LiDAR camera, but further research tells me no?
Anyway, if anyone knows of a list or specs I can use to determine which older models will work, it would be greatly appreciated.
Bro. This is awesome! You just saved me probably a week of frustration. Cheers.
You're welcome!
What phone version are you using?
And do you know if you can use Android as well, if you wanted to use that instead?
The same goes for Kinect; facial motion capture using an IR sensor has inherent problems.
First, it can't handle fast pronunciation properly.
Second, it's impossible to tell whether the lips are closed when they are relaxed.
Third, lip movements must be exaggerated for proper capture.
One way to compensate for all three is OpenCV, but OpenCV cannot capture Z-depth.
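To illustrate that last point, here is a minimal sketch (assuming `opencv-python` is installed and a webcam sits at index 0): a plain RGB camera only ever returns 2D pixel coordinates, so there is no Z component to tell you how far the lips are from the sensor, which is exactly what a TrueDepth or IR camera adds.

```python
# Minimal 2D face tracking with OpenCV -- note that every value it produces is in pixels,
# with no depth (Z) information at all.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)          # default webcam: an RGB stream, no depth stream
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Each detection is (x, y, w, h) in pixels -- 2D only, no Z component.
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("2D face tracking (no depth)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```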
Hi! I tried with my iPhone 12 Pro and added my IP, but it doesn't find me in UE. Do you know how I can find my iPhone there? Thanks.
Try loading up the MetaHuman sample project before adding your custom character - the one from the Unreal Marketplace. Also, you can try using both IPs that show up in your network properties. Hope these tips help, as that's the only way I can get it to work.
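On the "both IPs" tip: here is a small helper using only the Python standard library (not part of Live Link or this video's workflow) that lists the PC's IPv4 addresses, since the Live Link Face app needs the machine's LAN address and a PC often has more than one adapter. Note that hostname lookup may not list every adapter; `ipconfig` (Windows) or `ifconfig`/`ip addr` remains the ground truth.

```python
# List this machine's IPv4 addresses -- candidates to type into the Live Link Face app.
import socket

hostname = socket.gethostname()
# getaddrinfo returns one entry per configured address; filter to IPv4 (AF_INET).
addresses = sorted({info[4][0] for info in socket.getaddrinfo(hostname, None, socket.AF_INET)})
print(f"IPv4 addresses for {hostname}:")
for addr in addresses:
    print(" ", addr)
```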
You managed to get the best lip sync out of metahuman yet!
Thank you SO MUCH! Finally an ACTUAL tutorial... the Reallusion "tutorials" just showed you how to download some of the software and then skipped 10 steps. I got it up and running, but would you have any idea why my MetaHuman looks like half his face is numb?
Thank you for the tutorial! I ran into a problem with my MetaHuman, though. Only half of the face is being animated, and it looks odd (i.e., smiling makes the face pull a weird expression that doesn't match up). Does anyone know what could be causing this?
When I speak, it doesn't open the mouth fully.
wooooaaaaaahhhhhhhhhhhh this just blew me away !
When I export it doesn’t have the skin or textures
I assume you would then use the Take Recorder to capture a performance that you could then add as an animation track in the Sequencer?
Mine is just not linking at all - what am I doing wrong? This is the second video I've followed up to the linking step, and then I just stop because it's not linking with the iPhone.
I can get this all to work BUT, what do I do with it after?
How do I render a video from this?
Every single time I hit the record button, it crashes. I'm using UE 5.0, by the way, and I have no clue if that's even the right process for recording my animations from Live Link.
I also managed to get something recorded from within the Blueprint, but I couldn't do a darn thing with that animation file either.
Wonderful!
But when I try to do it, the movements are very slow and sometimes stutter.
Keep it up! I'm exactly where you are. Can you make a beginner tutorial for Unreal, especially for cinematics?
Thanks for the suggestion, I'm working on something for that now, might be a few months. Stay tuned :)
When I set up the Live Link IP, it wouldn't show up or work at all for me. Any help?
Thanks! Does this app work with android also?
Can Live Link connect to UE running on a MacBook Pro? I have Live Link running on an iPhone and added my Mac's IP, but UE does not recognize that Live Link is sending data to it (the iPhone name does not show in UE). Lucas, thanks for making the video.
Hi, I'm having an issue where my Live Link head separates from the body. Any help would be appreciated.
Thanks for creating this super excited to jump in!
How do you add an idle pose? I'm trying to make a video, but with the body just stiff it doesn't look good, and I don't know how to add that animation.
I would like to know which iPhones this is compatible with. Would an iPhone 6, 7, or 8 already be enough to run this app?
Hi, great tutorial.
I'm using a CC3 character in UE4 with Live Face, and my face mocap works; however, I'm trying to add my pre-made body animation to my character. I want this body animation to loop while I'm controlling the face mocap through Live Face. I'm bad with Blueprints. Any tips? Thanks.
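One common approach (not shown in this video, so treat it as a hedged suggestion) is a Layered Blend per Bone in the character's Anim Blueprint: the looping body animation drives everything from the pelvis up to the neck, while the Live Link pose drives only the face/head branch. The sketch below is plain Python, not the Unreal API; it just illustrates that per-bone masking idea.

```python
# Conceptual per-bone blend: face/head bones take their pose from the live capture,
# every other bone keeps the looping body animation.
def blend_per_bone(body_pose: dict, face_pose: dict, face_bones: set) -> dict:
    """Return a combined pose: face_bones come from face_pose, the rest from body_pose."""
    combined = dict(body_pose)                  # start with the looping body animation
    for bone in face_bones:
        if bone in face_pose:
            combined[bone] = face_pose[bone]    # override only the face/head branch
    return combined

body = {"pelvis": "walk_cycle", "spine_01": "walk_cycle", "head": "walk_cycle", "jaw": "rest"}
face = {"head": "livelink_rotation", "jaw": "livelink_jawOpen"}
print(blend_per_bone(body, face, face_bones={"head", "jaw"}))
```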
Hey, great video. Which iPhone did you use? What do you think about the iPhone 12?
Amazing, sir. One question: is it possible to control the eyeball movement with other models? And how can we record and export the animation to Blender?
Also, you should direct people to the setting that gives you the option to download 8K, 4K, or 2K textures. It seems the MetaHumans download at 8K resolution by default... which is fine if we're assuming everyone has terabytes of storage.
Can an iPhone 12 Pro Max connect with Live Link?
Is there a way to do this without an iPhone, using a webcam instead? I don't want to buy an iPhone just so I can do this. Please help.
Is it possible to connect the iPhone to the PC with a USB cable?
Is a webcam used?
I don't understand the mouthClose blendshape; normally I use a bone for the jaw / mouth open. So is mouthClose designed to compensate for the jaw bone opening the mouth? And how are the eyes controlled, as I usually have those skinned to bones too?
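My understanding (a conceptual sketch, not the actual MetaHuman rig logic) is that ARKit's jawOpen drives the jaw rotation while mouthClose keeps the lips sealed on top of that rotation, so yes, it effectively compensates for the jaw opening. The eyes are driven by their own look-direction curves (eyeLookUp/Down/In/Out) rather than by eye bones you pose directly. Roughly:

```python
# Conceptual illustration of the jawOpen / mouthClose pair -- plain Python, not rig code.
def lip_gap(jaw_open: float, mouth_close: float) -> float:
    """Rough effective lip separation, with both inputs in ARKit's 0..1 range."""
    return max(0.0, jaw_open - mouth_close)

# Jaw dropped but lips held shut (e.g. a closed-mouth "mmm"):
print(lip_gap(jaw_open=0.6, mouth_close=0.6))   # -> 0.0, lips stay closed
# Jaw dropped with relaxed lips:
print(lip_gap(jaw_open=0.6, mouth_close=0.0))   # -> 0.6, mouth opens
```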
Every time I try to export from Bridge, it says Unreal isn't open, even though it is open.
I'm working on a PC and I have tried to link my phone with the PC to activate Live Link, but it didn't work.
How should I solve this problem?
My PC has a wired connection, not a wireless one.
Mine will not connect. I even tried the trick of turning the sequence off and on and setting it to None. What am I doing wrong?
The head is static and it looks weird - there isn't enough head movement. How do I do it?
There's a setting in the app to turn on head rotation tracking.
@@AnimatorsJourney Thanks!
This is awesome, I'm definitely going to try it today. I was wondering, will my iPhone 8 Plus work?
The iPhone X is probably the lowest you need, as it has the front-facing depth-sensing (TrueDepth) camera.
Can Live Link work on a dedicated server, or only in a packaged build?
What is your iPhone model, please?
Hi, great tutorial. I have a question: when I try to record the sequence, my UE crashes. Does anyone have this same issue?
Yes, I tried it like 100 times.
Did you find a solution?
Wow, this is amazing. Can Live Link track body movement as well?
Only head rotation for now (in addition to the face)
@@AnimatorsJourney Thank you
@@AnimatorsJourney How ?
Does it only work with an iPhone?
Hi, I have a question: is there eyeball movement with iPhone Live Link? Please answer here and do a short demo if possible, many thanks. I'm going to buy an iPhone, which is a lot of money for me, so I want to know about the eyeball movement.
Hi, do you have a video on using Live Link on Paragon Characters? Or can Paragon Characters use Live Link?
Nice tutorial!
Thank you! Cheers!
How do you enable real-time head and neck rotation mocap? Is it possible at all?
Thanks for this!
I wonder if this can be used with a non-human? Or a highly stylized one?
Hello, I followed your process and downloaded the character from MetaHuman Creator, but I haven't been able to export the character from Quixel Bridge or connect my iPhone to Live Link through my IP address. Please, do you have a fix for this?
Awesome, that was very helpful, THANKS! I had a problem seeing the live tracking in the viewport. That works fantastically now.
Glad it helped! Thanks for watching :)
Can this only be used with an iPhone?
Thanks! Do you know how to record neck/head movement as well?
Hey, do you know why in UE5 the facial Live Link skeleton animation is not updating in the editor? I've tried literally everything, including the hack shown in this video.
Thanks
Can you explain how to add head rotation?
Is it possible to record in the Live Link app and then connect to Unreal Engine later, or does it have to be a live connection setup?
I have the same question. Did you ever find out?
@@V4NDLO No one ever responded. However, I think if you record the video ahead of time, then once you eventually get the Live Link setup working, you can point your phone or camera at the pre-recorded video playing on a TV or computer screen and it'll pick up that face recording.
Sorry, but can this be done on Android?
Can I use it on an iPhone SE (2016), or do I need a newer iPhone?
I've followed the instructions but for some reason my Metahuman is missing all face textures besides the eyes even after they've compiled. How do I fix this?
Thank you for sharing that. I managed to get it working. Any idea what I can use to track head movement?
This does it as well, it's just an option to turn on in the app settings
@@omgee8968 You need a tracker on your head.
Does anyone know why literally every metahuman skeletal mesh I use doesn't utilize the right side of their face?
nice video!
BTW, why does my Live Link show a "30.0 fps" warning message? It's an iPhone X. :X
What iPhone do you have? An X?
What iPhone do you have?
The face animation works, but at the last step it cannot link to the animation.
THANK YOU SO MUCH!
How does it cope with prescription glasses?
How much is it?
Why do I not get the subject even though I enter my correct IP?
Will this work with a Ryzen 3700X, 32 GB DDR4, and a GTX 1070?
How would you lip-sync animate to pre-recorded audio? Would you do that in Maya?
That's how I would do it personally, in Maya. I've got another video that shows how.
Great tute. But for some reason the textures aren't coming in. Any ideas on why?
They're not visible in UE? Is it just a matter of waiting for it to 'compile shaders'?
@@AnimatorsJourney The shaders eventually compiled. Success!
Can we go live on TikTok or Facebook using this method, as a MetaHuman?
Excellent video! Just discovered this amazing software. Still a total noobie. One question about Live Link: is it compatible with Android mobile devices as well as iPhone?
only iPhone
What format does the model have to be in if I want to build one totally from scratch for my project?
I want to be able to assign keyboard keys to force expressions that Live Link isn't seeing. Or is there a way to make UE4 more sensitive to Live Link, so as to deliberately exaggerate the expressions? Any help would be appreciated, and I'm happy to pay for your time.
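On the sensitivity question: one hedged option is to scale the incoming ARKit curve values before they reach the face rig - inside Unreal that kind of gain is typically applied to the face curves in the Anim Blueprint, but the math itself is trivial. The sketch below is plain Python, not Unreal API, and only shows the gain-and-clamp idea.

```python
# Exaggerate incoming blendshape values by a gain, clamped back into the 0..1 range
# so an over-driven smile can't push a curve past its intended limit.
def exaggerate(curves: dict[str, float], gain: float = 1.5) -> dict[str, float]:
    """Scale every blendshape value by `gain`, clamped to [0, 1]."""
    return {name: min(1.0, max(0.0, value * gain)) for name, value in curves.items()}

incoming = {"jawOpen": 0.35, "mouthSmileLeft": 0.20, "browInnerUp": 0.10}
print(exaggerate(incoming, gain=1.8))
# -> {'jawOpen': 0.63, 'mouthSmileLeft': 0.36, 'browInnerUp': 0.18}
```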
Sir, how can we create a Live Link Face setup in Autodesk Maya?
Purchase a Rokoko face capture license - Live Link is specific to Unreal.
I'm having issues using UE with my iPhone on Live Link. I used the correct IP, and UE sees it, but when I click Play it says "facetracking not supported on this device", even though I'm using an iPhone X. Can anyone help?
Can anyone tell me why my iPhone might not be showing up in UE 4.27.2 under the LLink Face Subj (iPhone black) entry? It worked fine in UE5, other than the mocap animations being janky on one side of the face (this is a commonly known issue with this MetaHuman process in UE5; I'm just waiting for someone to fix the bugs). Since UE5 mocap was janky, I wanted to try it in UE 4.27.2, but now my phone won't show up no matter what I do. I have restarted the Live Link iPhone app several times, restarted Unreal Engine, restarted my computer, double- and triple-checked that I am using the correct IPv4 address, and switched my network from public to private and vice versa, and I'm at a total loss as to what I should do. Does anyone have any suggestions for either my UE5 issue or my UE4 issue?
Great tutorial, Luca. I'm just wondering why the quality of the capture in every tutorial I've seen is not as good as the demo from Epic. I wonder what sort of device they use for mocap. I'm not convinced that creators could use that tool successfully with only a mobile phone. Thank you for the tutorial anyway. :)
Can you move the head/neck with it?
Can it also be done with a MacBook Air?
Use this for VR and make it your avatar so we can be ourselves in VR
And I've seen others getting whole-body movement as well as the facial animations - any idea how?
They have a mocap suit. Perception Neuron and Xsens are two companies that make them.
One last thing... the Live Link app does not work on the iPhone 8 with the latest iOS 14.6. The error reads "Live Link Face requires a device with a TrueDepth camera for face tracking".
Yeah, the app utilises ARKit which in turn requires the right hardware. iPhone X and above works.
Thank you so much!!
I don't know how to use unreal or metahuman, but this is pretty cool
Hello, I have a question. Which iPhone model do I need to run Live Link? Are there any special requirements (a particular sensor)?
12 Pro or Pro Max, 13 Pro or Pro Max, same for the 14.
Please make the MetaHuman Live Link Kit available. I can't buy it! I even managed to find some of the files, but I didn't find the rl_funciontion_lib file.
I don't know what you're referring to, I don't sell that.
The mouth not closing was a bug in the latest version of Faceshift before Apple bought them for 200 million dollars. It seems they don't know how to fix it. Pretty lame!
Is Live Link free?
Does anyone know why only the neck is moving and the face is not reacting at all?
same problem :(
Thanks for sharing. I wonder if we have access to the ARKit blendshapes so we can tweak them inside Unreal.
This may be helpful: twitter.com/epicchris/status/1385422520606679042?s=20
@@AnimatorsJourney Awesome. Following you on Twitter now. :) I saw your other recent video on MetaHuman to Maya, then to UE. This will let me do the facial animation via facial mocap as a first pass, then manually finesse it more in Maya, then export to Unreal to render - which eliminates the need for higher-quality Live Link to Unreal (the current Live Link to Unreal has pretty poor-quality facial animation).
Do you have any courses on a lip-sync/facial-animation workflow for Maya?
Recording doesn't work.
Is Live Link just for iOS?