Live Link Face Tutorial with New Metahumans in Unreal Engine 4

  • Published 28 Nov 2024

COMMENTS • 201

  • @AnimatorsJourney
    @AnimatorsJourney  2 years ago

    Free Training for a career in 3D Animation: ebook.digitalcreatorschool.com/animatorsjourneyfreetraining

    • @agentwhiteblack5869
      @agentwhiteblack5869 2 years ago

      Hello, I want to animate a character's face in an animation of mine. Does that work with Live Link as well, i.e. recording the facial expressions from the camera into the Sequencer?

  • @EagerSleeper
    @EagerSleeper 3 years ago +41

    You jumping on the ball with the new Metahuman stuff is A-tier.
    Everybody out here is confused, and there are almost no resources that get to the point like yours. Nice work.

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago +8

      Ha, thanks! I know the feeling of being frustrated for lack of clear instructions, that's what got me started creating courses. Cheers.

    • @ChasingLatitudes
      @ChasingLatitudes 2 years ago +1

      @@AnimatorsJourney you realize no one, and I mean no one, in your audience understands anything you are saying, not the programs, nothing. You need to do a step-by-step tutorial from 0, not start way in the middle.

    • @mithunkrishna3567
      @mithunkrishna3567 2 years ago

      sooo true

  • @virtual_intel
    @virtual_intel 3 years ago +7

    Wow you didn't need the sample MetaHuman file, nor did you make any blueprint adjustments. I'm blown away by how simple you made this work using that iPhone LiveLink app connection. Amazing!

  • @3dhotshot
    @3dhotshot 3 years ago +33

    PLOT TWIST: the guy in the bottom right is a MetaHuman!

  • @davidkelly4210
    @davidkelly4210 2 years ago +2

    I just discovered Metahuman. I gotta try this.

  • @marcelogonzalez9565
    @marcelogonzalez9565 3 years ago +8

    I learned Maya thanks to you. It is good to see that you will help us with Unreal too. Amazing!

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago +3

      That makes me happy to hear I've helped :) Thanks for commenting!

  • @Art_911
    @Art_911 3 years ago +3

    Great intro. They still have issues with mouth closure, but there are workarounds out there.
    My question is... I'm not made of money and cannot afford a new iPhone or even the iPad/iPad Pro.
    So I'm looking to buy a refurbished or used iPad and can't find a list of older models that will work with Live Link and do the face recognition. At first I assumed it was the LiDAR camera, but further research tells me no?
    Anyway, if anyone knows of a list or specs I can use to determine which older models work, it would be greatly appreciated.

  • @codydillon7428
    @codydillon7428 2 years ago +1

    Bro. This is awesome! You just saved me probably a week of frustration. Cheers.

  • @swiftdetecting
    @swiftdetecting 3 years ago +2

    What phone version are you using?
    And do you know if you can use Android as well, if you wanted to use that instead?

  • @porororo9056
    @porororo9056 3 years ago +1

    The same goes for Kinect: facial motion capture using an IR sensor has inherent problems.
    First, it can't handle fast speech properly.
    Second, it is impossible to tell whether the lips are closed when they are only resting together without tension.
    Third, the lips must be exaggerated for proper capture.
    The way to complement the above three is OpenCV, but OpenCV cannot capture Z-depth (see the sketch below).
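
    A minimal sketch of that last point, assuming the opencv-python package and an ordinary webcam (nothing from the video): a plain 2D pipeline only ever yields pixel-space detections, so there is no per-point depth to tell a relaxed, closed mouth from a slightly open one.

    # Rough illustration only: webcam + OpenCV gives 2D boxes in pixel space,
    # with no Z-depth, unlike a TrueDepth/ARKit capture.
    import cv2

    cap = cv2.VideoCapture(0)  # default webcam
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            # (x, y, w, h) are pixels only -- no depth information.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("2D face tracking (no Z-depth)", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()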

  • @SofiaHerrero222
    @SofiaHerrero222 3 years ago +3

    Hi! I tried with my iPhone 12 Pro and added my IP, but it doesn't show up in UE. Do you know how I can find my iPhone there? Thanks

    • @virtual_intel
      @virtual_intel 3 years ago

      Try loading up the MetaHuman sample project (the one from the Unreal Marketplace) before adding your custom character. Also, you can try using both IPs that show up in your network properties; the sketch below shows a quick way to list them. Hope these tips help, as it's the only way I can get it to work.
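
      For what it's worth, a small sketch (not from the video) for double-checking which IPv4 addresses the PC exposes, i.e. the value you type into the Live Link Face app's target list. The app streams over UDP (by default, I believe, to port 11111), so that port also has to be allowed through the firewall.

      # List this PC's IPv4 addresses so you know which one to enter in the
      # Live Link Face app. Assumes Python 3 on the machine running Unreal;
      # nothing here is Unreal-specific.
      import socket

      hostname = socket.gethostname()
      print("Host:", hostname)

      # Every IPv4 address the hostname resolves to (possibly one per adapter).
      for info in socket.getaddrinfo(hostname, None, socket.AF_INET):
          print("IPv4:", info[4][0])

      # The address actually used to reach the LAN; connect() on a UDP socket
      # sends nothing, it just selects the outgoing interface.
      s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      try:
          s.connect(("8.8.8.8", 80))
          print("Likely LAN IPv4:", s.getsockname()[0])
      finally:
          s.close()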

  • @privet20005
    @privet20005 3 years ago +1

    You managed to get the best lip sync out of metahuman yet!

  • @Andimax11
    @Andimax11 2 years ago

    Thank you SO MUCH! Finally an ACTUAL tutorial... the Reallusion "tutorials" just showed you how to download some of the software and then skipped 10 steps. I got it up and running, but would you have any idea why my metahuman looks like half his face is numb?

  • @dabeerygoat
    @dabeerygoat 2 years ago +4

    Thank you for the tutorial! I ran into a problem though with my MH. Only half of my MH face is being animated and it looks odd (i.e. smiling would make the face make a weird expression and doesn't match up). Does anyone know what could be causing this?

  • @ShadowsClub
    @ShadowsClub 1 year ago +1

    When I speak, it doesn't open the mouth fully.

  • @alexusa-zo3fn
    @alexusa-zo3fn 1 year ago

    wooooaaaaaahhhhhhhhhhhh this just blew me away !

  • @levitabusman
    @levitabusman 2 years ago +1

    When I export it doesn’t have the skin or textures

  • @CaseyChristopher
    @CaseyChristopher 2 years ago

    I assume you would then use the Take Recorder to capture a performance that you could then add as an animation track in the Sequencer?

  • @donrivas8074
    @donrivas8074 2 years ago +1

    Mine is just not linking at all, what am I doing wrong?? This is the second video I've followed up to the linking step, and then I just stop because it's not linking with the iPhone.

  • @faddlewaddle2615
    @faddlewaddle2615 2 years ago +1

    I can get this all to work BUT, what do I do with it after?
    How do I render a video from this?
    Every single time I hit the record button it crashes. I'm using UE5.0 btw and I've no clue if that's even the process for recording my animations from Live Link.
    Also managed to get something done from within the BP but couldn't do a darn thing with that animation file either.

  • @Cangel06
    @Cangel06 2 years ago

    Wonderful!
    But when I try it, the movements are very slow and sometimes stutter.

  • @9ayadis
    @9ayadis 3 years ago +4

    Keep it up, I'm exactly where you are. Can you make a beginner tutorial for Unreal, especially for cinematics?

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago

      Thanks for the suggestion, I'm working on something for that now, might be a few months. Stay tuned :)

  • @raxian_
    @raxian_ 3 years ago +1

    When I set up the LiveLink ip thing, it wouldn't show up or work at all for me. Any help?

  • @EdwardMilliganBouwls
    @EdwardMilliganBouwls 3 years ago +1

    Thanks! Does this app work with android also?

  • @tr-dg2iy
    @tr-dg2iy 3 years ago +2

    Can Live Link connect to UE running on a MacBook Pro? I have Live Link running on an iPhone and added my Mac's IP, but UE does not recognize that Live Link is sending data to it (the iPhone name does not show in UE). Lucas, thanks for making the video.

  • @saif0316
    @saif0316 2 years ago

    Hi, I’m having an issue where my live link head separates from body? Any help would be appreciated

  • @louis.blythe
    @louis.blythe 3 years ago

    Thanks for creating this, super excited to jump in!

  • @Instant_Nerf
    @Instant_Nerf 2 years ago

    How do you add an idle pose? I'm trying to make a video, but the body is just stiff... it doesn't look good, and I don't know how to add that animation.

  • @glatze_-.-
    @glatze_-.- 2 years ago

    I would like to know which iPhones this is compatible with. Would an iPhone 6, 7, or 8 already be enough to run this app?

  • @virtamay7311
    @virtamay7311 2 years ago +2

    Hi, great tutorial.
    I'm using a CC3 character in UE4 with Live Face, and my face mocap works; however, I'm trying to add my pre-made body animation to my character. I want this body animation to loop while I'm controlling the face mocap through Live Face. I'm bad with Blueprints. Any tips? Thanks

  • @shockerson
    @shockerson 3 years ago

    Hey. Great video. Which iPhone did you use? What do you think about the iPhone 12?

  • @KriGeta
    @KriGeta 3 years ago +2

    Amazing, sir. One question: is it possible to control the eyeball movement with other models? And then how can we record and export the animation to Blender?

  • @joshuascott9598
    @joshuascott9598 3 years ago +1

    Also, you should direct people to the setting that gives you an option to download 8K, 4K, or 2K. It seems the MetaHumans download at 8K res by default... which is fine if we're assuming everyone has terabytes of memory.

  • @blcgamer875
    @blcgamer875 3 years ago

    Can the iPhone 12 Pro Max connect with Live Link???

  • @longsonfullmetal1856
    @longsonfullmetal1856 3 years ago

    Is there a way to do this without an iPhone, using a webcam instead? I don't want to buy an iPhone just so I can do this. Please help.

  • @monadrian.official
    @monadrian.official 3 years ago

    Is it possible to connect the iPhone to the PC with a USB cable?

  • @tylerdurden-vevo
    @tylerdurden-vevo 2 years ago

    Is a webcam used?

  • @Polytricity
    @Polytricity 2 years ago

    I don't understand the mouthClose blendshape; normally I use a bone for the jaw / mouth open. So, is mouthClose designed to compensate for the jaw bone opening the mouth? And how are the eyes controlled, as I usually have those skinned to bones too? (See the sketch below.)
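
    One rough way to picture it (my reading of the ARKit curves, not the actual MetaHuman rig math): jawOpen drops the jaw and the lips follow it, while mouthClose counter-animates the lips back toward each other, so the lips can stay sealed even with the jaw open. The eye-look curves (eyeLookUp/Down/In/Out and friends) are streamed as blendshape values too, even if your own rig drives the eyes with bones.

    # Illustration only -- not the real rig logic. ARKit streams both curves
    # in [0, 1]; jawOpen lowers the jaw (lips follow), mouthClose pulls the
    # lips back together to compensate for the open jaw.
    def approx_lip_gap(jaw_open: float, mouth_close: float) -> float:
        """Crude stand-in for how far apart the lips appear."""
        return max(0.0, jaw_open - mouth_close)

    print(approx_lip_gap(0.0, 0.0))  # 0.0 -> relaxed face, lips together
    print(approx_lip_gap(0.6, 0.0))  # 0.6 -> jaw dropped, lips apart
    print(approx_lip_gap(0.6, 0.6))  # 0.0 -> jaw dropped, lips held shut ("mmm")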

  • @Deano-M
    @Deano-M 2 years ago

    Every time I try to export from Bridge, it says Unreal isn't open, even though it is open.

  • @HassanAhmed-vs2fx
    @HassanAhmed-vs2fx 2 years ago

    I'm working on a PC and I have tried to link my phone with the PC to activate Live Link, but it didn't work.
    How should I solve this problem?
    My PC has a wired connection, not a wireless connection.

  • @gryphonsegg
    @gryphonsegg 3 years ago

    Mine won't connect. I even tried the trick of turning the sequence off and on and setting it to none. What am I doing wrong?

  • @val_evs
    @val_evs 3 years ago +1

    The head is static and it looks weird; there isn't enough head movement. How do I add it?

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago +1

      There's a setting in the app to turn on head rotation tracking.

    • @val_evs
      @val_evs 3 years ago

      @@AnimatorsJourney Thanks!

  • @NeerajIngle
    @NeerajIngle 3 years ago

    This is awesome, I am surely gonna try it today. I was wondering, will my iPhone 8 Plus work?

    • @dushyantm9579
      @dushyantm9579 3 years ago

      The iPhone X is probably the lowest you need, as it has the depth-sensing front-facing camera.

  • @Kanermi
    @Kanermi 2 years ago

    Can Live Link work on a dedicated server, or only in a packaged build?

  • @Babakkhoramdin
    @Babakkhoramdin 2 years ago

    What is your iPhone model, please?

  • @cesarmartinez1947
    @cesarmartinez1947 2 years ago +2

    Hi, great tutorial. I have a question: when I try to record the sequence, my UE crashes. Does anyone have the same issue?

    • @ShortsforLife12
      @ShortsforLife12 2 years ago

      Yes, I tried it like 100 times.
      Did you find a solution?

  • @sikfreeze
    @sikfreeze 3 years ago +2

    Wow, this is amazing. Can Live Link track body movement as well?

  • @portfolioyzadora
    @portfolioyzadora 3 years ago

    Does it only work with iPhone?????

  • @shockerson
    @shockerson 3 years ago

    Hi, I have a question: is there eyeball movement with iPhone Live Link? Please answer here and do a short demo if possible, many thanks. I'm going to buy an iPhone and that is a lot of money for me. I want to know about eyeball movement.

  • @tayam92
    @tayam92 2 years ago

    Hi, do you have a video on using Live Link on Paragon Characters? Or can Paragon Characters use Live Link?

  • @3d_4_dummies
    @3d_4_dummies 2 years ago +1

    Nice tutorial!

  • @edgarprotsko1558
    @edgarprotsko1558 2 years ago

    How do you enable real-time head and neck rotation mocap? Is it possible at all?

  • @thedadwars
    @thedadwars 10 months ago

    Thanks for this!

  • @GrimGearheart
    @GrimGearheart 3 years ago

    I wonder if this can be used with a non-human? Or a highly stylized one?

  • @luckyenam6329
    @luckyenam6329 3 years ago

    Hello, I followed your process and downloaded the character from MetaHumans, but I haven't been able to export the character from Quixel Bridge or connect my iPhone to Live Link through my IP address. Please, do you have a fix for this?

  • @robot_collective
    @robot_collective 3 years ago +2

    Awesome, that was very helpful, THANKS! I had a problem seeing the live tracking in the viewport. That works fantastically now.

  • @GrimGearheart
    @GrimGearheart 3 years ago

    Can this only be used with an iPhone?

  • @SProj-px7wm
    @SProj-px7wm 2 years ago

    Thanks! Do you know how to record neck/head movement as well?

  • @danielshamota
    @danielshamota 2 years ago

    Hey, do you know why in UE5 the facial Live Link skeletal animation is not updating in the editor? I've tried literally everything, including the hack shown in this video.
    Thanks

  • @BryantRicart
    @BryantRicart 2 years ago

    Can you explain how to add head rotation?

  • @derf0007
    @derf0007 2 years ago +1

    Is it possible to record in the Live Link app and then connect to Unreal Engine later, or does it have to be a live connection?

    • @V4NDLO
      @V4NDLO 2 years ago

      I have the same question. Did you ever find out?

    • @derf0007
      @derf0007 2 years ago

      @@V4NDLO No one ever responded. However, I think if you record the video ahead of time and do eventually get the Live Link setup working, you can point your phone or camera at the pre-recorded video playing on a TV or computer screen and it'll pick up that face recording.

  • @maevick2442
    @maevick2442 3 years ago

    Sorry, but can it be done on Android?

  • @PunxTV123
    @PunxTV123 3 years ago

    Can I use it on an iPhone SE (2016), or do I need a newer iPhone?

  • @1amsaint581
    @1amsaint581 3 years ago

    I've followed the instructions but for some reason my Metahuman is missing all face textures besides the eyes even after they've compiled. How do I fix this?

  • @InspectorXyto
    @InspectorXyto 3 years ago +2

    Thank you for sharing that. I managed to get it working. Any idea what I can use to track head movement?

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago

      This does it as well, it's just an option to turn on in the app settings

    • @ViensVite
      @ViensVite 2 years ago

      @@omgee8968 You need a tracker on your head.

  • @Andimax11
    @Andimax11 2 years ago

    Does anyone know why literally every metahuman skeletal mesh I use doesn't utilize the right side of their face?

  • @dotapodtv9449
    @dotapodtv9449 3 years ago

    Nice video!
    BTW, why does my Live Link give a "warning: 30.0 fps" message? It's an iPhone X :X

  • @Michaduo
    @Michaduo 2 years ago

    What iPhone do you have? An X?

  • @rudy552
    @rudy552 1 year ago

    What iPhone do you have?

  • @UncleTiaoTiao
    @UncleTiaoTiao 2 years ago

    The face animation works, but at the last step it cannot link to the animation.

  • @Silpheedx
    @Silpheedx 2 years ago +1

    THANK YOU SO MUCH!

  • @ingeonsa
    @ingeonsa 3 years ago

    How does it cope with prescription glasses?

  • @dandylion-evn7w2
    @dandylion-evn7w2 3 years ago

    How much is it?

  • @AS-jf2mf
    @AS-jf2mf 3 years ago +3

    Why do I not get the subject even though I enter the correct IP?

  • @deadmikehun
    @deadmikehun 2 years ago

    Will this work with a Ryzen 3700X, 32GB DDR4 and a GTX 1070?

  • @brianroanhorse5274
    @brianroanhorse5274 3 years ago +1

    How would you lip-sync animate pre-recorded audio? Would you do that in Maya?

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago

      That's how I would do it personally, in Maya. I've got another video that shows how.

  • @jamalqutub_instrument
    @jamalqutub_instrument 3 years ago +1

    Great tute. But for some reason the textures aren't coming in. Any ideas on why?

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago +1

      They're not visible in UE? Is it just a matter of waiting for it to 'compile shaders'?

    • @jamalqutub_instrument
      @jamalqutub_instrument 3 years ago +1

      @@AnimatorsJourney The shaders eventually compiled. Success!

  • @narikragam
    @narikragam 3 years ago

    Can we go live on TikTok or Facebook using this method, as a MetaHuman?

  • @sp33dkill
    @sp33dkill 3 years ago

    Excellent video! Just discovered this amazing software; still a total noobie. One question about Live Link: is it compatible with Android mobile devices as well as iPhone?

  • @mikey3d912
    @mikey3d912 3 years ago

    What format does the model have to be in if I want to build one totally from scratch for my project?

  • @georgefelner
    @georgefelner 3 years ago

    I want to be able to assign keyboard keys to force expressions that Live Link isn't seeing. Or is there a way to make UE4 more sensitive to Live Link, so as to deliberately exaggerate the expressions? Any help would be appreciated, and I'm happy to pay for your time.

  • @shakeelshahid3145
    @shakeelshahid3145 2 years ago

    Sir, how can we create Live Link Face in Autodesk Maya???

    • @AnimatorsJourney
      @AnimatorsJourney  2 years ago

      Purchase a Rokoko face capture license; Live Link is specific to Unreal.

  • @georgefelner
    @georgefelner 3 years ago

    I'm having issues using UE with my iPhone on Live Link. I used the correct IP and UE sees it, but when I click play it says "facetracking not supported on this device", even though I'm using an iPhone X. Can anyone help?

  • @Andimax11
    @Andimax11 2 years ago

    Can anyone tell me why my iphone might not be showing up in UE 4.27.2 under the LLink Face Subj (iphone black)? It worked fine in UE5 other than the mocap animations being janky on one side of the face (this is a commonly known issue with this metahuman process in UE5, I'm just waiting for someone to fix the bugs). Since UE5 mocap was janky, I wanted to try it in UE4.27.2, but now my phone won't show up no matter what I do. I have restarted the live link iphone app several times, restarted unreal engine, restarted my computer, double and triple checked I am using the correct IPv4 address, switched my network from public to private and vice versa and I'm just at a total loss as to what I should do. Does anyone have any suggestions for either my issue with UE5 or UE4?

  • @LordRubino
    @LordRubino 3 years ago

    Great tutorial, Luca. I'm just wondering why the quality of the capture in every tutorial I've seen is not as good as the demo from Epic. I wonder what sort of device they use for mocap. I'm not convinced that creators can use this tool successfully with only a mobile phone. Thank you for the tutorial anyway. :)

  • @gigxr
    @gigxr 3 years ago

    Can you move the head/neck with it?

  • @PsychotronAyax
    @PsychotronAyax 3 years ago

    Can it also be done with an Airbook?

  • @Austin1-0-8
    @Austin1-0-8 3 years ago +6

    Use this for VR and make it your avatar so we can be ourselves in VR

  • @gameswownow7213
    @gameswownow7213 3 years ago +1

    And I've seen others have whole-body movement as well with the facial animations. Any idea how?

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago +3

      They have a mocap suit. Perception Neuron and Xsens are two companies that make them.

  • @joshuascott9598
    @joshuascott9598 3 years ago

    One last thing... the Live Link app does not work on the iPhone 8 with the latest iOS 14.6. The error reads "Live Link Face requires a device with a TrueDepth camera for face tracking".

    • @aaronambrose1006
      @aaronambrose1006 3 years ago

      Yeah, the app utilises ARKit which in turn requires the right hardware. iPhone X and above works.

  • @mass23
    @mass23 2 years ago

    Thank you so much!!

  • @bobatea4732
    @bobatea4732 3 years ago

    I don't know how to use unreal or metahuman, but this is pretty cool

  • @vertex-6714
    @vertex-6714 3 years ago

    Hello, I have a question. Which iPhone model do I need to run Live Link? Are there any special requirements (a sensor)?

    • @ViensVite
      @ViensVite 2 years ago

      12 Pro or Max, 13 Pro or Max, same for the 14.

  • @henriquemodena6826
    @henriquemodena6826 3 years ago

    Please make the MetaHuman Live Link Kit available. I can't buy it! I even managed to find some files, but I didn't find the rl_funciontion_lib file.

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago

      I don't know what you're referring to, I don't sell that.

  • @OmegaMouse
    @OmegaMouse 3 years ago

    The mouth not closing was a bug in the latest version of Faceshift before Apple bought them for 200 million dollars. It seems they don't know how to fix it. Pretty lame!

  • @5gradeproductions519
    @5gradeproductions519 2 years ago

    Is Live Link free?

  • @andrejbykov4917
    @andrejbykov4917 2 years ago

    Does anyone know why only the neck is moving and the face is not reacting at all?

  • @freenomon2466
    @freenomon2466 3 years ago

    Thanks for sharing. I wonder if we have access to the ARKit blendshapes so that we can tweak them inside Unreal.

    • @AnimatorsJourney
      @AnimatorsJourney  3 years ago

      This may be helpful: twitter.com/epicchris/status/1385422520606679042?s=20

    • @freenomon2466
      @freenomon2466 3 years ago +1

      @@AnimatorsJourney Awesome. Following you on Twitter now. :) I saw your other recent video on MH (MetaHuman) to Maya then to UE. This will let me work on the facial animation via facial mocap as a first pass, then manually finesse it more in Maya, then export to Unreal to render. That eliminates the need to do higher-quality Live Link to Unreal (the current Live Link to Unreal has pretty bad quality facial animation).
      Do you have any courses on lipsync/facial animation workflow for Maya?

  • @mindped
    @mindped 1 year ago

    Recording doesn't work.

  • @DanielDiaz-rf5lf
    @DanielDiaz-rf5lf 3 years ago

    Is Live Link just for iOS?