Metahuman Animator Tutorial

  • Published May 29, 2024
  • Metahuman Animator Tutorial
    JSFILMZ Mocap Helmet: • Cheap Mocap Helmet for...
    Mocap Waitlist: JSFILMZMOCAP AT GMAIL DOT COM
    Grab my new Unreal Engine 5.1 Course here! Be sure to share it with everyone!
    Link to lighting course: www.artstation.com/a/25961360
    Link to How to make a movie in UE5.1 www.artstation.com/a/22299532
    jsfilmz.gumroad.com/l/lmaqam
    My Realistic Warehouse VR Demo: www.artstation.com/a/27325570
    My Fortnite Map: 3705-9661-2941
    Join this channel if you want to support it!
    / @jsfilmz
    Sign up with Artlist and get two extra months free when using my link below.
    Artlist
    artlist.io/artlist-70446/?art...
    Artgrid
    artgrid.io/Artgrid-114820/?ar...
    @UnrealEngine #unrealengine5 #metahumananimator #metahuman
  • Film & Animation

COMMENTS • 294

  • @Jsfilmz  11 months ago +6

    ITS ON LINKEY DONKEYYYYY! Here is a video of the animator app: studio.ua-cam.com/users/videoiXH79mrKADM/edit
    First rap test ua-cam.com/video/mAT1X4xbEdA/v-deo.html

    • @jimmwagner  11 months ago +1

      Do you need the depth capture in Live Link or can you just use video?

  • @MR3DDev  11 months ago +9

    Bruh! It is nuts how accurate this thing is.

  • @lukewilliams7020  11 months ago +7

    Yes so glad to hear you’ll be doing more tests with this. Bro just realised I’ve been on this unreal journey with you since your channel was a baby. You’re literally the don in this field. Keep em coming

  • @PrintThatThing  11 months ago +3

    Incredible! You're so ON IT. Thanks for the walk-thru. Very helpful!!!

  • @MiguePizar  11 months ago +2

    Thanks for the tutorial, I can't wait to try it, although it will take hours since the shorts that I'm doing have a lot of dialogue, but it will look a lot more realistic. Best

  • @MANIAKRA  11 months ago +1

    BIG! I was wanting to explore this, thanks for the tutorials!

  • @jimmwagner  11 months ago +1

    Knew this was coming the second I saw the Unreal post about this. Excited to try this out today.

  • @leonaraya2149  11 months ago +4

    Bro, you are a master!
    And the way Unreal is improving this technology day by day is amazing.

    • @Jsfilmz  11 months ago +1

      thx dude!

  • @ohheyvoid  11 months ago +2

    You're amazing. Thank you for staying on the pulse!

    • @Jsfilmz  11 months ago +1

      thanks for being here!

  • @pokusalisobaki  1 month ago

    Thank you so much, its amazing

  • @DLVRYDRYVR  11 months ago +2

    7:00 I'm listening to this video driving around, windows down, level 20 on Bluetooth and at a Bus Stop with little old ladies 🤦‍♂️

  • @kool-movies  11 months ago +1

    just done my first quick test following your tuts to guide me through the process, got to say wowzer, Animator is amazing. so much cleaner than i expected. well worth the iphone rental :}. keep the vids coming :}

    • @Jsfilmz  11 months ago

      oof someones bout to buy an iphone 😂

  • @2beornot2bable  11 months ago +1

    Great job!

  • @user-ct8my8rv9c  11 months ago +3

    Good and fast tutorial, nice man

  • @moneybagvr2113  11 months ago +3

    You the man bro

  • @sakibkhondaker  11 months ago +4

    Some day i think you will make tutorials before release. haha. So quick!!!!!!!!!!!

    • @Jsfilmz  11 months ago +3

      hahaha yea they didnt select me for beta access unfortunately i know some did so im gonna have to catch up to them

  • @christiandebney1989  11 months ago +1

    Awesome.. Randomly woke up at 4am knew there was a reason... thank you!

  • @D3Z_animations  10 months ago

    Awesome video! Thanks for the quick setup explanation! But I have to point out that MetaHuman Animator already uses AI for its solves.
    When you hit the "Prepare for Performance" button, it trains a model on your face to later mimic the way it moves so it can animate other MetaHuman characters to that likeness. That's why this step took 8-10 minutes : )

  • @JoshPurple  11 months ago +1

    LUV'd your last vid with your little girl (all of your vids ❤ ) 🏆😁👍!

    • @Jsfilmz  11 months ago +1

      hahah thx man it got 20k views on twitter hahah

  • @3DComparison  11 months ago +1

    Great stuff!

    • @Jsfilmz  11 months ago

      Thanks!

  • @IDAVFX  11 months ago +1

    Thanks Lord Helmet!!!

  • @StyleMarshall  11 months ago +2

    sooo quick, dude! you are fast as hell... 😁

    • @Jsfilmz  11 months ago

      i woke up 5 am today haha

  • @lukewilliams7020  11 months ago +1

    The accuracy looks insane

  • @muviing5427  11 months ago

    Thank you for sharing a fun and essential tutorial!! Anyway, is there a way to use the neck animation recorded by the Unreal Live Link facial capture? When I import the facial anim with neck animation and apply it to my MetaHuman skeleton, the body and face break because of the neck anim. Using just the facial anim without neck movement works, but is there a way to keep the neck anim...? Or is the only solution to combine body mocap data (with neck anim) + facial capture anim with just the face movement (without neck anim)...?

  • @marcusjordan8036  11 months ago

    This is awesome! Would I be able to take the metahuman/face mocap and export it into blender? I don't need any of the textures or materials. Thanks!

  • @Pauliotoshi  11 months ago +2

    Great showcase! In case you plan to make more test videos, can you show expressions that are hard to do with Apple ARKit? I'm curious how it compares.
    For example, a sad face with a hanging lower lip, asymmetric brow movement 🤨, a worried face 😟, or any interaction between teeth, lips and tongue.

    • @Jsfilmz  11 months ago +2

      man im a terrible actor but ill try

    • @Pauliotoshi  11 months ago +2

      @@Jsfilmz Thanks a lot!

  • @lili-ro7cw  11 months ago

    Thank you so much for your tutorial, so timely. In addition, I am trying to import from the mesh body in UE 5.2, and it seems the tracking-marker link can no longer be carried out. Have you encountered this?

  • @DariuszMakowski  7 months ago

    Have you done any vids on how to take that head animation and apply it to another MetaHuman?

  • @HQLNOH  11 months ago +2

    Hey! I just wanted to say thank you for your videos! If it wasn't for you I couldn't have created animation for my metahumans. ❤ Thank you

    • @Jsfilmz  11 months ago

      That is awesome!

    • @HQLNOH  11 months ago +1

      Omg yes! My video would never have made it if it wasn't for you! I even mentioned you in appreciation in the description ❤️

    • @Jsfilmz  11 months ago

      @@HQLNOH thanks man, not many give credit back to me, i appreciate it

  • @alexandredonciu-julin5969  11 months ago

    Thanks for the tutorial! Do you have a recorded take file I can use as a test? I don't have an iPhone. Thanks

  • @Fotenks  11 months ago +1

    I Hope that behind the scenes they are working on full body mocap

  • @MichaelHurdleStudio  11 months ago +1

    This was jam-packed with so much information. I have to pause this video and slowly follow all of the steps. This is amazing! I'd like to purchase the headset for my iPhone. Did you make it yourself, or did you get it from a website?

    • @Jsfilmz  11 months ago

      home made bro with my sweat and blood its not $129 anymore though its $169 now

  • @natanaelgauna3600  9 months ago

    Amazing content as always!
    Could you please make a video on troubleshooting these three issues:
    - “Promote Frame” randomly jumping to a different frame than the selected one.
    - MetaHuman Identity Solve producing inaccurate results.
    - “Add Teeth Pose” breaking the Identity Solve even more.
    Thanks a lot!!!!!

  • @TetraVaalBioSecurity  11 months ago +1

    A lost detail in Hideo Kojima's DS2 trailer was the note at the end saying the performance capture is powered by MetaHuman.
    That game is going to have insane levels of detail.

    • @Jsfilmz  11 months ago

      yea hideo was lurking around on my channel when ue5 first came out i was one of the first ones to cover it

  • @supermattis108  8 months ago

    So, whenever I want to make a facial animation (of myself doing it with live link) I need to go through these steps? But they can be used on every metahuman?

  • @terry9183  11 months ago

    You're absolutely smashing it man 👏🏽 Thank you so much for the awesome content!
    I do have one slight issue though!..
    For the life of me, I cannot get the Livelink Face mocap to work with separate body mocap. The head/chest just detaches itself and they are both independent. It's driving me insane. I've tried following some advice on the UE forums to no avail 😭
    Have you experienced this yet? any tips?
    Thank you

  • @josiahruelas498  11 months ago +1

    Great video man! I'm getting an error when I hit the process button in the Metahuman Performance. It says "The Processing Pipeline failed with an error." Any ideas on how to fix this would be appreciated. Thanks!

  • @oldmatttv  11 months ago +1

    This really is such a big step forward. Very exciting. The only problem is the iPhone only requirement. That makes no sense for something like this, but let's hope it changes sooner or later.

    • @Jsfilmz  11 months ago +1

      beats buying a real mocap system 😂

    • @oldmatttv  11 months ago

      @@Jsfilmz True I suppose :)

  • @veithnurtsch1069  11 months ago +2

    I need to buy iPhone 12 + first hahaha

    • @Jsfilmz  11 months ago +3

      looks like 11 works

    • @aaagaming2023  11 months ago +2

      @@Jsfilmz In their docs they group the X, 11 and 12 together and then the 13 and 14 together. Then they give a caveat about the X, saying that it's not capable of capturing more than a few seconds, but they don't say that about the 11, so yeah, I think the 11 is capable of capturing longer-form content like the 12.

    • @andrewwelch5017  11 months ago

      @@aaagaming2023 I tested the iPhone 11 Pro last night and it works fine, zero issues.

  • @Aragao95  11 months ago +3

    Has anyone got it working well with the iPhone 11? The new Epic post says it needs at least an iPhone 12..

    • @Jsfilmz  11 months ago +1

      try it broski

  • @Stonefieldmedia  11 months ago +1

    ...and off we go!

  • @binyaminbass  11 months ago +1

    when doing facial mocap, what do you do for a mic? do you have a little one that you attach to your helmet? what kind of mic is good for that?

    • @Jsfilmz  11 months ago

      just my good ole Sennheiser G2

    • @binyaminbass  11 months ago

      @@Jsfilmz do you attach it to the arm of your helmet?

  • @benblaumentalism6245  8 months ago

    I wonder how long a given take can be. I have to make some EDU products with this, and they might need to be on the long-ish side.

  • @PecoraSpec  2 months ago

    can i move the mocap data to another software, like blender?

  • @prodev4012  11 months ago

    Do you think there will eventually be a marketplace to buy Mesh to MetaHuman scan data? For example, I don't know any Korean girls to run this on with an iPhone like that company did, but I'd be willing to pay people who know attractive people from every race if they did.

  • @dmingod999  10 months ago

    Did you see the floating-head issue? The face animates and the body remains still.. if neck rotation is disabled it's fixed, but the neck-rotation part of the capture is lost.. Do you know how to avoid that? Thanks! 🙂🙏🏻

  • @michaelb1099  1 month ago

    is there a way to create the captures from a webcam?

  • @michaeleansan2730  11 months ago +1

    crazy times!

  • @guilhermenunes7075  10 months ago

    4:43 "Iden-TITTIES" hahahahahaha
    Great video man! Thank you for the content.

  • @dzaariqolbiiakbarqowli3433  11 months ago

    Hi! It's an amazing video. I tried to import my recorded video from an iPhone 11, but in the Capture Manager the video can't be read. Any suggestions?

    • @Jsfilmz  11 months ago

      firewall?

  • @DLVRYDRYVR  11 months ago +3

    Hope my XS works. Don't want any more Apple pradux

    • @IamSH1VA  11 months ago +2

      Please try it & please report back if it works, I have the same iPhone XS.
      I am not gonna have access to a Windows system for at least 15 days, but I am dying to test this feature.

  • @sibinaayagam1838  11 months ago

    If I'm not wrong, you should turn off neck solving with a head-mounted camera and use it only for static cameras.

    • @Jsfilmz  11 months ago +1

      in my case ill use neck movements from my mocap, the exported sequence doesnt come with neck. check out the new rap video i uploaded

  • @TheChrisNong  10 months ago +1

    I get the "assertion failed" crash every time I promote the first frame... what should I do?

  • @Griff_The_5th  11 months ago +1

    Do you think it's worth shelling out a bit more money for the iPhone 13 mini over the 12?

    • @Jsfilmz  11 months ago +1

      im broke so ur askin wrong person

  • @Persianprograph  11 months ago +1

    Incredible! Is there a way to export Animation data to Maya to have more freedom for further tweaks?

    • @Jsfilmz  11 months ago

      that would be amazing but i dont think thats possible yet

    • @lukepeterson9081  11 months ago

      This guy made a script to do it ua-cam.com/video/ecYO5-5fL0U/v-deo.html

  • @itsyigitersoy  3 months ago

    Hi, I've asked this to a lot of people but couldn't get a proper answer. After creating an identity and performing with it, my MetaHuman character's default lips and teeth change because they take on my identity. Is it possible to keep the animation exactly the same but with the character's default facial features?

  • @Blairjones3d  8 months ago +1

    I may be a little late to the party, but I'm confused why you choose LiveLink Archives in CAPTURE SOURCE (which is uploading footage etc. from the PC drive?), but then when you go to import you do it from the iPhone in CAPTURE MANAGER?

    • @Jsfilmz  8 months ago +1

      i made tutorial both ways my usb transfers faster for bigger files

    • @Blairjones3d  8 months ago

      @@Jsfilmz oh sweet as no worries bro. Any tips on uploading from USB then using archives? Seems more complicated to setup the Performance that way...

  • @naytbreeze  11 months ago

    Having a huge problem: when I try to add the animation to my MetaHuman in the Sequencer, it becomes detached from the body around the shoulders area. Was it because I may have moved too much in the capture? Can't seem to get the body and the head attached.

  • @prasoondhapola2875  11 months ago

    I don't have an iPhone. Will it work with my 2021 iPad Pro?

  • @sybexstudio  4 months ago

    Did everything to the t and my animation sequence doesn't show up in the Face animation menu. Any tips?

  • @omri1324  9 months ago

    What about body movement and hands?

  • @JonCG  11 months ago

    Hello bro. I just went back to this video because, for me, after updating to 5.2.1 it says "Preparation for Performance Failed". Any tips on it, bro?

  • @jaykeastle8804  11 months ago +1

    BOSS!

  • @juanmaliceras  11 months ago +1

    it's here!!!!! downloading plugin!!!!!

  • @Va.bene.  11 months ago +1

    hey bro! awesome!
    I have a little issue: when I track my face with the animation sequence, it's like nothing happened in my sequencer, but it exported correctly because when I open it alone the animation is fine. any idea?
    Thank you!

    • @Jsfilmz  11 months ago

      i dont understand :( maybe join unreal discord and post pics and issue there

  • @ResonanceRebirth  4 months ago

    JS can that be done with DAZ characters too?

  • @burov.a.4467  11 months ago +1

    Hi! Thanks for the cool video. Here's the question: will it work if you upload a regular video shot on a camera?) just no iphone))

    • @Jsfilmz  11 months ago

      dont think so iphones are precalibrated like i showed in the performance editor

    • @andrewwelch5017  11 months ago

      No, the software requires depth information which a regular camera can’t provide. Borrow an iPhone 11 or newer.

  • @ArabicTechAILab  2 months ago

    When I play the level in a new window (PIE), the MetaHuman character moves his face with my motions very well, but when I click on any window other than the PIE window it starts to lag. When I click on the PIE window again, the character moves normally!
    What should I do?

  • @kool-movies  11 months ago +2

    Epic, insane. Looks much better than Faceware or Live Link. Like you say, the curves look smooth and there's no jitter. And to think I spent 4 hours last night doing 30 sec of manual facial animation that looks rubbish. So if I get a friend who has an iPhone, they can send me clips?

    • @Jsfilmz  11 months ago

      yes

    • @kool-movies  11 months ago +1

      @@Jsfilmz awesome, is there any reason not to get the iPhone 12 mini?

    • @Jsfilmz  11 months ago

      @@kool-movies i havent tested longer takes with 12 mini yet but it used to overheat on me alot hahaha

    • @kool-movies  11 months ago

      @@Jsfilmz your rap video is a long take so hopefully it will be good. I just ordered/rented a cheap refurbished one, hopefully it arrives tomorrow. :} The 3Lateral video that just released is mind-blowing. Assume they used a stereo camera?

  • @lilylikesherpotato3461  6 months ago

    It just hangs when I try to bake the animation....???

  • @emotionamusic  11 months ago +1

    Super sick! Do you feel like the results are significantly better than the live stream app?

    • @Jsfilmz  11 months ago +1

      bro go watch my rap with it it will answer your question

  • @UnrealEnginecode  11 months ago +2

    After updating the iPhone app, the Live Link Face app gives a warning for the MetaHuman Animator capture part saying "your device model is unsupported, you may continue but your results could be affected." It works, but I wonder: my economic situation is not good and I will develop games. How much of a performance difference does the iPhone 11 make?

    • @andrewwelch5017  11 months ago +1

      The software requires an iPhone with a TrueDepth sensor because it needs depth data to accurately track your face. You can always borrow an iPhone 11 (or newer) to do the tracking and then transfer the file to your computer.

    • @UnrealEnginecode  11 months ago

      Thank you for your answer. What are they giving this warning for? Do you think there will be a noticeable quality difference?

    • @andrewwelch5017  11 months ago

      @@UnrealEnginecode The quality I got was excellent so I'm not worried about it.

  • @OfficialCloverPie  11 months ago

    my head keeps detaching from the body when i attach the animation to the face and play it in sequencer

  • @inrapture  10 months ago

    Is it possible only with an iPhone?

  • @JoshuaLundquist  9 months ago +1

    Also hey I'd like one of those helmets, but I do have an iphone 11, will that work? It's bigger than the mini, of course.

    • @Jsfilmz  9 months ago +1

      i know some people whos tried 11 with MHA and they said it works i havent tested it myself

    • @JoshuaLundquist  9 months ago

      Gonna buy an iphone 12 mini like you have, can you tell me which mount you use so I can buy that and yr helmet? @@Jsfilmz

  • @JDRos  5 months ago

    So how do we connect this to a body so we can add animations to it? 😅

  • @user-mp8sx1gy2m  7 months ago

    How do I add my facial animation to a different character?

  • @ielohim2423  11 months ago +1

    Great vid!! In the showcase, didn't they show a way you could use this app to generate textures for your MetaHuman? Will you be showing us how as well?

    • @Jsfilmz  11 months ago +1

      can you send me that video? i dont think i saw that

    • @ielohim2423  11 months ago

      @@Jsfilmz I think I may be mistaken. I thought the HellBlade II showcase did it, but I think they may have had a premade MetaHuman.

    • @ielohim2423  11 months ago

      @@HellMunky What's your workflow? What do you use to generate the mesh/textures that you import into UE to use in the plug-in?

  • @AllanMcKay  11 months ago +2

    Great video - you mentioned MHA will only later use AI. Just a heads up, it is AI-driven currently. Pixel tracking is only a small part of the foundation.
    Great video!

    • @Jsfilmz  11 months ago +1

      wait like its doing ai pose estimations already?

    • @AllanMcKay  11 months ago +2

      @@Jsfilmz yeah, there's a lot under the hood that's ML-driven already. Facial tracking doesn't account for wrinkles or much else other than eye and mouth shapes; everything else is interpolated with AI. There's more coming, but the foundation is already utilizing AI in a lot of areas. The second-pass animation is still being improved on, so it'll continue to get better. But it's a training model based on a lot of human facial animation, to know what to do when cheeks are raised, nostrils flared, eyebrows wrinkled, etc.
      Night-and-day different to something like Live Face or other tools, which purely track eye and mouth shapes and don't leverage any AI to then interpolate wrinkles and pseudo face muscles into the animation.

    • @Jsfilmz  11 months ago +1

      @@AllanMcKay oh wow hahaha, crazy stuff, for 1.0 its not bad. oh btw, for iphone it can only output 30 even when recording 60, right? Thanks, i love knowing about the tech

  • @matteo.grossi  11 months ago +1

    Hey J, for some reason the result of your test is not very good compared to other tests I have seen online. Do you reckon there was something in the configuration/shooting conditions that interfered? Or perhaps the other tests used other types of cameras, such as stereo cams, rather than an iPhone? Thanks for the tut though.

    • @Jsfilmz  11 months ago

      i think the demo videos that came out were done with stereo cams not sure

    • @matteo.grossi  11 months ago

      @@Jsfilmz The rig used when they announced MHA wasn't a metahuman rig, in fact they only showcased how the new system works on metahuman rigs for like, 5" at the end of the presentation.

    • @Jsfilmz  11 months ago +1

      @@matteo.grossi hahaha thats cheating then right lol

  • @SkyHandOneTen  11 months ago +1

    Can you do a video of how to warp a metahuman to look like a custom character? RS3D Zwrap is a good wrapper. I have this model of NAS I want to put to the test, as well as Michael Jordan and the Rock.

    • @Jsfilmz  11 months ago

      just mesh to metahuman it mang save u the headache hee hee

    • @SkyHandOneTen  11 months ago +1

      @@Jsfilmz Hmmm, I guess I'm a bit behind. Not sure what that is or where it is. I'll look it up. Thanks

    • @Jsfilmz  11 months ago +1

      ​@@SkyHandOneTenoh yea man mesh to metahuman jsfilmz look it up its easy

    • @SkyHandOneTen  11 months ago +1

      Seems that info is everywhere I was just looking up the wrong terminology all this time. Thanks.

  • @coulterjb22  1 month ago

    The Performance audio track shows 'Unresolved Binding' though I can hear the audio. When I export the animation no sound is exported......the internet has failed me. Anyone? UE 5.3 and 5.4

  • @JoshuaLundquist  9 months ago

    Epic is great but they gotta give us some options with the markers, like maybe don't make the ones that go on teeth straight up yellow / green lol? Great video though.

  • @davidMRZ  11 months ago +2

    So strange, whatever I put in Capture Source (live link or archive), it doesn't find it. It goes green in Capture Manager but nothing shows up 🤔

    • @Jsfilmz  11 months ago +1

      hey man im uploading another tutorial stand by maybe u missed something

    • @davidMRZ  11 months ago

      @@Jsfilmz great, thank you so much

  • @sahinerdem5496  10 months ago

    Unreal doesn't see my Live Link Face recorded file, any help?

  • @sahinerdem5496  11 months ago

    Can I use an Android video recording of my face?

  • @cubiclulz  11 months ago +1

    Can it be used not with my face but with one created in MetaHuman Creator? And how do I do it?

    • @Jsfilmz  11 months ago

      stay tuned

  • @tomhalpin8  11 months ago +1

    Were you able to figure out how to connect the iPhone so it can be wirelessly triggered to record from Unreal?

    • @Jsfilmz  11 months ago

      just the regular livelink way?

    • @tomhalpin8  11 months ago

      @@Jsfilmz With the new MetaHuman Animator version of LiveLinkFace, can't seem to connect. Trying to match my body mocap with the face mocap.

    • @Jsfilmz  11 months ago

      cant do it live animator is offline

    • @tomhalpin8  11 months ago

      @@Jsfilmz I wonder if I can use OSC to sync up the recording process on the phone with the mocap

  • @mt_gox  7 months ago

    His next epic: "Get Shorty"

  • @jonos138  3 months ago

    I have a problem where the character shows more bottom teeth than top. I don't speak like that, and Live Link doesn't do it.
    I tried re-tracking the face markers but still had the same problem.

  • @Amelia_PC  11 months ago +1

    Could you provide me with the information about your iPhone 12 model so that I can search for a similar one to buy?

    • @Jsfilmz  11 months ago +1

      12 mini broski

    • @Amelia_PC  11 months ago

      @@Jsfilmz Thanks!

  • @nmatthes2927  11 months ago

    Do you have to record the videos with the Live Link app?
    Because for every Android user this would be a real pain in the *ss and make this tool completely unusable and useless for non-Apple-device owners.
    I tried using a normal mp4 video but it said that there is no footage in the folder.

    • @Jsfilmz  11 months ago +1

      tell android to get it together, i have an android as my main phone

  • @kameroco  7 months ago

    I've done it but how can I render it... help me pls

  • @RikkTheGaijin  11 months ago +1

    is there a way to export the head movement? it looks unnatural to have just the facial expression with a still head.

    • @Jsfilmz  11 months ago

      yes watch my tripod tutorial i uploaded today

    • @RikkTheGaijin  11 months ago +1

      @@Jsfilmz it looks like that if you use a Metahuman that you already had before the latest update, it won't export the head movement, but if you download a Metahuman now, it will. I guess they have updated all metahumans in the Bridge catalog. I still don't know how to attach it to the body.

    • @RikkTheGaijin  11 months ago

      @@Jsfilmz I watched it but you are not showing how to attach the body, you just say "I will import a body animation"

    • @muviing5427  11 months ago +1

      Wow, I have the same issue. How do I attach the body with proper movement...? I think there is a way to solve it with BP... I checked the Unreal forum but I can't find the proper way.

  • @artofjhill  11 months ago +1

    yeeeeaaaaaaa 🔥

    • @Jsfilmz  11 months ago

      hey J can i borrow 10k subs so i can hit 100k? thanks mang

  • @chipcode5538  11 months ago +2

    On the metahuman video they use 4 calibration images, is there a reason why you did not use side views?

    • @Jsfilmz  11 months ago +2

      i have helmet on bro hahahaha they on a tripod if i move my head the camera wil move too

    • @chipcode5538  11 months ago

      @@Jsfilmz You could make the calibration video with a tripod and use the helmet for the capture. I don’t know if it will improve the calibration, the metahuman seems a little off especially the nose.

  • @3rdDim3nsn3D  11 months ago +1

    Yo, do you have any clue why the MetaHuman I created in Creator does not show up in Bridge? It's so frustrating man, I've spent over an hour now trying to get it to work.
    I am mad as fu!& now 😅

    • @Jsfilmz  11 months ago

      update bridge

  • @tessslaglory  11 months ago

    Is it possible to record facial animation in 5.2, and then retarget it to 5.1 character?

    • @Jsfilmz  11 months ago

      cant go backwards far as i know

  • @thehotpotatosquad  11 months ago

    Hello, awesome tutorial, but I have a very basic problem: Capture Manager does not recognize the files; literally the step you do at 1:20 of your video shows no video files for me. I recorded with an iPhone 13 mini (supports Live Link, real-time also works), and I am working on a Windows PC. Any ideas what the problem could be?

    • @Jsfilmz  11 months ago +1

      firewall?

    • @thehotpotatosquad  11 months ago

      @@Jsfilmz I'm not so sure about that, because the real-time connection works; it's just that the imported video is not recognized at all. I must mention that I am using a Windows PC, so might there be a problem while transferring files? I just imported the whole zipped folder from OneDrive and unpacked it, but is there maybe another way?

  • @juanleche5382  2 months ago

    Idk what's going on, it says that it's no longer available 😕 I've been trying to do this for weeks now