It WORKS!!!!!!
Hello, I wanted to ask you something. My phone doesn’t connect to UE5, but I need that in order to get Live Link working. Do you have any advice for me? I already checked the WiFi and it’s the same network as my iPhone. Please help!
have you found a way to rig an IK for a character to match the metahuman or default mannequin while retaining live link face functionality on a custom model?
Amazing! Instead of using Take Recorder to record the facial capture, you can right-click the Face track and choose "Bake Animation Sequence" to save the facial capture as an animation asset.
yes i know ive been using metas since prerelease mate
@@Jsfilmz well, that step was missing from your video tutorial, I just thought someone might be interested. Sorry if it seems I was stating the obvious. Peace!
Hi there! I have a problem... The facial animation works great in the timeline when scrubbing... but if I add a camera and Camera Cuts, I can't export the animation render... the MetaHuman face stays without any facial movement! Do you have a solution? Thanks
@@michelesaporito7035 I wonder if you can bake it to the face rig after you get the clip in the timeline? Right-click the "Face"-labeled portion of the skeleton under the character and you may see a "bake anim to" menu item. Then you could select the face anim panel rig and it should transfer the anim to the face panel controls it is driving as keyframe data.
@@michelesaporito7035 I have the same issue
wow this is a massive improvement for any solo dev
Man!!! This is huge! Thanks a lot for helping us figure it out.
Best soft Introduction Ever!!
Amazing! I can feel the vibe.. really great addition to UE5. I wonder if it can also take in audio for speech at the same time, that would be a great package for a facial animation workflow.
the app records a video as well along with the take
Christmas comes early!
Like you said, probably something simple.
Thanks for this!
hahaha yea i was overthinking it
So cool, man!
Do you have a full walkthrough of the process?
3D model creation > facial animation applied > exporting the full animation to third-party software as OBJs or FBXs?
Thanks!
Need this!!
Thanks man for the update! - really appreciate the tips as I'm going through the same problems! will try this out later tonight
Guys it really works, I checked.
I'm having trouble getting the Take to show up in the Face Subject. Only my phone shows up. Any advice?
This is absolutely amazing keep going!
What about the audio? This is a game changer for me. Now we can get VO talent to download the app and record themselves, then share the files. That’s a lot less production, better result.
yup exactly!!!!! sync audio in post bro work with anyone around the world with an iphone
@@Jsfilmz Is there any way to do face mocap with an Android phone?
Great video. A tutorial on clean up would be really helpful.
Thanks, great video, well laid out and easy to follow :)
Thanx man, this is really awesome!
Pretty sure none of the VP tools require simulation or play mode. They should all work in editor. Try checking if 'Update Animation in Editor' property on the skeletal mesh is turned on. That's what enables any AnimBP data or livelink data to show up in Editor.
i tried enabling it no dice like i said in the video
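For anyone who wants to flip that flag from a script instead of the Details panel, here's a minimal editor-Python sketch. It assumes the "Update Animation in Editor" checkbox is exposed to Python under the property name update_animation_in_editor, which may differ between engine versions, so treat it as a starting point rather than the official API.

```python
import unreal

# Flip "Update Animation in Editor" on every SkeletalMeshComponent of the
# actors currently selected in the level (e.g. your MetaHuman blueprint).
actor_subsystem = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
for actor in actor_subsystem.get_selected_level_actors():
    for comp in actor.get_components_by_class(unreal.SkeletalMeshComponent):
        try:
            # Assumption: the Details-panel checkbox is exposed under this
            # property name; it may be named differently in your engine version.
            comp.set_editor_property("update_animation_in_editor", True)
            unreal.log("Enabled editor animation updates on " + comp.get_name())
        except Exception as err:
            unreal.log_warning(comp.get_name() + ": " + str(err))
```

Run it from the Output Log's Python console with the character selected in the level; if the property name assumption is wrong you'll just get a warning per component rather than a crash.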
Very nice work! I tried to ask AI if this was possible (offline) and it failed miserably. AI is too much A and not enough I. Hahaha! Humans rock! Especially this human (in the video). All the other tutorials took way longer to get to the point, but still get points for trying. Keep this up. We dig you. ❤
nice! thank you for sharing looking forward to 5.1!
Hey man. What do I choose in the data table row options when importing the CSV? It won't let me proceed without choosing an option
This is a game changer my guy. Subscribed and liked. Your content is super helpful.
Welcome aboard!
Very useful as usual :)
Nooice!
i forgot how to do this shit.... this video saved my ass
hahahaha welcome back mang hope all is good wit ya
@@Jsfilmz had a few heart attacks and surgeries just made it back home... Got into the avalanche beta so bout to dig into that... no rest for the wicked
Great tutorial, although you actually need to choose the CSV file that ends with "cal", as the "raw" one is uncalibrated.
yup i made an update to this on my course :)
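If you end up importing takes in batches, here's a tiny plain-Python sketch for picking the right file. It only assumes the naming described in the comment above (the calibrated CSV's name ends in "cal", the uncalibrated one in "raw"); the take folder path is a made-up example.

```python
from pathlib import Path
from typing import Optional

def find_calibrated_csv(take_folder: str) -> Optional[Path]:
    """Return the calibrated CSV in a Live Link Face take folder, if any.

    Assumes the naming described above: the calibrated file's name ends
    with "cal", the uncalibrated one with "raw".
    """
    for csv_file in sorted(Path(take_folder).glob("*.csv")):
        if csv_file.stem.lower().endswith("cal"):
            return csv_file
    return None

# Hypothetical path - point it at wherever you copied the take off the phone.
print(find_calibrated_csv(r"D:\Mocap\LiveLinkFace\MyTake_001"))
```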
Yeah man, you got it👍 now hopefully iphone wont have heating issue by doing this method 😉
yeaaa dude it might still do i might upgrade to iphone 12 pro
@@Jsfilmz the heating is probably caused by the live data transfer, but i might still be wrong.
@@limitlessinmind yea was thinkin the same
Never used either before. What are the advantages and disadvantages of using it over WiFi (same network) versus the method you are showing now? Is it only that you don't need internet?
exactly meaning i can record offsite
This is cool even though I'm not a creator, it's nice...very nice😆🙌
Well done! You're da man!
You da woman!
On the back of this: if you try to render in path tracing mode, I found that the imported keys alone won't work, even when playing while running the sequencer. This gets resolved by baking the animation into a clip from the Face clip in the sequencer. This will probably make your life easier throughout the importing process. Happy to elaborate further through other channels.
Yes, cool, it works! Thank you
Great stuff bro
thanks for trying out buddy
You got it man you saw it here first whoop whoop
@@Jsfilmz yeah man, sometimes you have to bite into the "make it work" kind of thing.
in the end, if you don't give up, it works and the whole thing comes together.
no matter if it's blueprint stuff, or even tweaking an animation till it looks good.
congratulations that you figured that out, cause there is no real documentation.
it's like packaging a game or project for android. trial and error, sometimes!
love ya channel, you are trying cool stuff, mate.
respect from the northern Alps, Bavaria.
@@airkidproductions2848 loved and miss bavaria mate!
@@Jsfilmz when you are here next time you will be welcome, you know, you just have to send a message and you'll get a bed in my house!
You simply saved my life. Thank you for this tip!
Awesomesauce !!!
Thank you 👏🏽
hi, i really like your video. i have a question: i have an iphone x, if i buy an iphone 12 will the motion capture performance be better?
worked thanks man
Thanks man, this is huge, this is huge!! But how do I do this with a Ready Player Me avatar??
Very Awesome! When I did it though, my character had the upper lip issue :D not sure if it's 5.1 or the importer method, but I sorted it out. I like this method of not having to be connected via Live Link to record the mocap :) Thank you
Glad it helped!
I love u bro thank youu
Bro I've got a question: for example, I turned on some plugins; how do I check which plugins I've turned on in UE5? Would be great to know.
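The authoritative list is the Edit > Plugins window, but as a rough sketch you can also read the project's .uproject file outside the editor. Note it only records plugins you've explicitly toggled, not engine defaults, and the project path below is hypothetical.

```python
import json
from pathlib import Path

def list_project_plugins(uproject_path: str) -> None:
    """Print the plugins explicitly enabled/disabled in a .uproject file.

    Engine plugins left at their default state don't appear here; the full
    picture is the editor's Edit > Plugins window.
    """
    data = json.loads(Path(uproject_path).read_text(encoding="utf-8"))
    for plugin in data.get("Plugins", []):
        state = "enabled" if plugin.get("Enabled", False) else "disabled"
        print(f"{plugin.get('Name', '?'):35s} {state}")

# Hypothetical project path - adjust to your own .uproject.
list_project_plugins(r"D:\UnrealProjects\MyProject\MyProject.uproject")
```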
If we render with the MRQ would the facial animation show in the rendered frames? Hoping simulate isn't just an in-game solution.
yes also u can record with take recorder if u want
I have followed all the steps but somehow it's not letting me import the CSV file
Can anyone here sort this matter out?
I understood how to bake face animation, but how to bake head (neck) movement?
Hello, thank you for the video. But I wonder why, even though my ARKit is active, there's no iPhone detected under my ARKit face subject. Do you have a solution? Thank you
Thanks for sharing!!!
No problem 😊
Can I import someone else's video into my app? I mean, can a video that another person recorded with the Live Link Face app be imported into my Live Link Face app? I only have the .mov files..
Great hit buddy.. Salut
Hi JS, thank you for your comprehensive tutorials. I have a specific question. I have a mocap suit and a custom 3D character in my scene. I want to record my mocap body and import it into Unreal. Then I wish to bring my face capture data into Unreal as FBX and retarget it to my character.. Can you say how I may do this?
Very very cool!
This is perfect.
indeed
Hi there! I have a problem... The facial animation works great in the timeline when scrubbing... but if I add a camera and Camera Cuts, I can't export the animation render... the MetaHuman face stays without any facial movement! Do you have a solution? Thanks
Same issue
Thanks for the vid, liked and subscribed. On mine there is no CSV file in the folder, i have 4 files. The raw and CSV are missing for every take i do. Any ideas?
Did you read my comment on the last video? You can save the data as an animation sequence too
wym, like take recorder? yea, pretty much the same as online, u can record the animation as well
This is amazing, it works! Btw, do you know how to set Unreal to stop using the GPU when it's in the background? When i watch youtube, i want my Unreal project to stop using the GPU, because the PC gets so hot. Thank you
no i think it only has cpu
@@Jsfilmz I found the answer: type t.MaxFPS 1 in the console (which means 1 fps). It solved my problem, but it's not the most convenient way! If you have another way, please reply.
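If you'd rather script that workaround than type it each time, here's a small editor-Python sketch using the same t.MaxFPS console variable (the 10 fps value is just an example):

```python
import unreal

def cap_editor_fps(max_fps: int) -> None:
    """Set the t.MaxFPS console variable from editor Python.

    Same effect as typing "t.MaxFPS <n>" in the console; 0 removes the cap.
    """
    unreal.SystemLibrary.execute_console_command(None, "t.MaxFPS {}".format(max_fps))

cap_editor_fps(10)   # throttle the editor while you're doing something else
# cap_editor_fps(0)  # restore the default, uncapped frame rate
```

There's also an editor preference, "Use Less CPU when in Background" (Editor Preferences > General > Performance), that throttles the editor automatically when it loses focus, which may be closer to what you're after.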
Would it work with marketplace characters with blendshapes? Or is it strictly Metahumans?
Hello JSFILMZ. I watch a lot of your informative videos, they are great. But unfortunately I have tried this method a few times and it does not work for me. Each time I import the CSV file into Unreal, the sequence time range is shorter than the original video on my iPhone. Is this a bug on my iPhone X or maybe in Unreal?
Thank you so much!! I really need this! But I can't find the raw file on my iPad. There are only these 4 files: .csv, thumbnail, video, take.json. Do you know how to set it to save the raw file?
So how much does it cost to have a studio for mocap stuff since you own one already
not sure, problem is u prolly wont have customers depending on your location, its still fairly new to a lot of people
Hi! I have a question: How do I record the simulation? Is it possible to use keyframes in the sequencer?
did you figure out any way of editing the animation after hooking it up in a sequence... for example, when you are doing live capture using the iphone, you can edit the graph and over-crank face shapes (like jaw/lip closing etc)
working on something that was captured on set, so i have a live face recording, but when applied to the metahuman the mouth is a bit slack. wondered if you knew of any way of modifying the data stream to add animation/adjustments ON TOP of the recorded data?
thanks for all your vids by the way, really helping me get to grips with unreal/metahuman!
Hi thanks for sharing, looks amazing. When will this plugin be available for anyone to download? Is it from the Epic Unreal Marketplace?
You have the Justin Timberlake voice, bro! Excellent! By the way, do you have a video about scanning your own face and importing it into Unreal? Thanks...
old one ua-cam.com/video/rwUTuy9v4Wk/v-deo.html
HI JSFILMZ , Great Tutorial👍👍 , Greetings rosuckmedia
thanks rosuck! glad to see u here
Well done for working it out!
Can’t find the “raw” file in my folder? Is there a setting to enable?
This is another awesome one!! Everything works up until I get to the drop down. I only see my phone and not the added animation. I noticed it already said your animation was in the live link. Did it automatically show up for you or did you have to do anything extra to get it to show up in that list?
Can you do a render with this Live Link import feature? Sometimes it works for me using Render Queue and then other times it doesn't track the animations in the render. Love the channel!
yes I saw this amazing course on artstation about this if you want to check it out its gotten pretty good reviews www.artstation.com/a/22299532
JDog, I did everything that you presented in this video step-by-step. I couldn't get it to work. No animation on my character's face at all. I have been trying very hard to get MetaHuman Animator to work with no luck. Then I tried the Live Link Face app and couldn't get it to recognize my iPhone. I went to the drop-down menu for the ARKit face subject and couldn't get anything in the drop-down menu. Then I followed this video and found my take (which I transferred to my Google Drive from my iPhone), and still no luck. Is there any way you can help me identify what I might be doing wrong? Thank you so much. As always, J.P.
Very cool, we had a similar issue where sub clips weren't getting created but the LiveLink data was there. Did you then manage to re-record in Sequencer and get the facial animation onto the MH Face controls?
i havent yet but it should work
Hello friend, do you have a tutorial on using Live Link to control the scene with a cell phone and record a video? I see that it exists for iPhone, but I don't know if it exists for Android.
nah android apk is behind man they lazy
have you tried it now? my live link face app is creating a frame log csv file instead of a raw csv, and the frame log is treated as a data table file in unreal instead of a level sequence. please help
Thanks, does that work on android?
no cos android dont innovate
Thanks again for another good one brutha. Can you send me a link to 5.1? I am having a hard time finding it. I wanna try the decal joint out. I watched your video on it the other day, and it’s exactly what I’m needing right now.
unfortunately you will have to compile it from scratch but hopefully 5.1 will come out soon
@@Jsfilmz that’s not a problem, I just can’t find it. Email me. Would love to talk about some ideas with you. Daniel@madmixedmedia.org thanks again brutha. Keep up the dope work.
For some reason the CSV file doesn't contain any shape key data, or at least the sequencer doesn't display any, what could be wrong?
please, can you share any alternative application for android?
Broo.. I got the animation in editor.. but Head rotations are not exporting.. What can I do ?
Hey thanks a lot!
Little issue: when I drag & drop my csv file I get a DATA TABLE OPTIONS window and I can't see any option to import a Level Sequence (in your case it happened automatically), so I'm stuck :) do you know how to avoid that?
you gotta follow the entire tutorial broski this is unreal engine 5.1
@@Jsfilmz Thanks for the reply. I followed all the instructions, I am using UE 5.1 and I've activated the LiveLinkFaceImporter plugin. I still see the DataTable Option window when I import the csv file and I don't get a Level Sequence. Maybe it's because I'm using a Mac, can't say :) Thanks again for the video, anyway!
Solved: on Windows it works perfectly but on Mac it doesn't! Now, for some reason, I just don't have the "LLink Face Head" option, but that's another problem :)
I get the same datatable menu on Windows too, after the latest 5.1 update...
@@darioomizzolo8532 Hey Dario! Ah that sucks - same problem here. Would you please let me know if you manage to figure out a solution?!
can an iPad be used for this, or for live link in general?
long as it has truedepth
Hey J, I have a question for you, How can I have multiple MetaHumans in one Level Sequence using this system? I tried importing one MetaHuman with another but I crashed the session. Any thoughts?
Is this a free plugin? As it’s not free from iclone
yea free
for some reason it is not finding my iphone in the LLink Face Subj tab... any ideas?
so cool
agreed
Is this better over your helmet capture?
yea metahuman animator is nuts
Damn
Hey, anyone getting a weird error where the entire animation isn't imported into the level sequence? I'm getting the first 10 or so seconds, then nothing.
"inconvenient' would describe your previous comnt better then
I was able to bake the animation to the face skeleton as keyframes but this doesn't keep the head rotation. I think the head rotation is on the body skeleton instead of the face. Is there a way to get this head rotation from the CSV file baked to the head movement keyframes only? Thanks for these vids. Very helpful.
I tried copy/pasting the keyframes from the iPhone head.ctrl yaw, pitch, roll one lane at a time and it works to get the keyframes to/fro but it is very fiddly. Sadly these values move the head very little as if there is some interpolation between the actual numbers and what is driving the livelink face app values. We need a way to get this data to the neck without a lot of science project time. There should be a documented way to get the head rotation from the livelink face app onto the character. I notice this question is asked in a lot of places.
Same prob man.. Did you find any solutions 😢
@@jayarajvin3404 I think I did find some post that had a solution but it was convoluted. I think Unreal could really make this process far more streamlined. There are too many gotchas and extra things to fiddle around with to really do this easily. There are too many cases where someone will want to save on GPU cycles and not capture a performance live, and instead do it on their phone and have the hi-res capture transferred successfully with little hassle. I know they could make this far more plug-and-play if they just standardized the file format so that it always has the head rotation data as part of the way it hooks up, then maybe add options to disable the head rotation, or any layer for that matter, for those that don't want it.
@@caseycbenn Actually I got the head rotations and face animation in the Sequencer & it plays just fine. But when I export, the head rotations are not exporting, this is the problem man. Do you have any idea why, or what's causing it not to export even though it plays fine in the sequencer?? btw thanks for responding, much appreciated!
Is anybody else having the issue where their Metahuman just stares blankly forward? All other animations are there, but the eye animations are not working.
Hey Jae, I'm trying to get this to work in 5.1 but it's not connecting. When I drag the CSV file into the project and open the sequencer, I go to select the sequence from the ARKit face subj drop-down but it's no longer showing up there. The name of my phone shows up but not the sequence name (i.e. the CSV file name). Would appreciate any help you can provide to troubleshoot.
Same problem. I had this all working in 5.0.3 in general but nothing seems to work in 5.1
Waiting on the tutorial on how to do it
this is a tutorial broski
@@Jsfilmz lol 😂 you know what, my bad, because I realize it is, thanks. when I did it I didn't have to simulate, I just pressed the play button and it worked. with this, who needs iclone? this is even better than iclone live link
I looked it up on Marketplace in Epic Games and I couldn't find it. What's the name of the plugin again?
its built in 5.1 dudes
Okay Thank You!
@@Jsfilmz Can this be done with an Android?
So wait, how did you get 5.1? I don't even see it as an update
github broski
@@Jsfilmz github broski what is that?
@@REALVIBESTV ceo of unreal engine, his name is Github Broski, pronounced "Gee Thub Brow Skee"
@@pstwr 😂
How did you access UE 5.1?
Currently 5.1 asks you what file type the .CSV should be imported as - DataTable, FloatTable, FloatCurve etc. Then there are subcategories to choose from there.
I'm going through each and not getting an anim file. Any suggestions?
u need to enable the plugin
@@Jsfilmz Thanks for your reply. LiveLinkFaceImporter is enabled in Plugins and the project restarted, still same issue. And I'm in 5.1.
Any ideas?
Maybe the plugin only works on Windows ..
Yup. Only works on Windows. FYI
Very cool...I think....what I mean, and forgive me if I'm asking a stupid question, I don't have a smartphone; a cell phone yes, but it's text and talk only....NO DATA service (I don't really need a service other than T and T, I'm at home on my computer all day so I don't see a need for one) So if I purchase an iPhone, can I just use it as a Wi-Fi phone (to get the app) and not have a phone service?
If I don't need a service then YES this will be exactly what I need!
yes exactly. just wifi to download the app and thats it
@@Jsfilmz Ok GREAT.....guess I better save up to get one one of these days...soon I hope! 😉
i phone x are cheap now mate
@@Jsfilmz Great thanks for the tip...I'll look into it soon...THANKS bro!
where is the plugin? thanks
waiting for the day they will release the live link for Android
bro android effin sleepin ive been long android user
Us
can you plz share yr csv file to test it
I get the animation but not the voice part
I thought about this and I came to the same solution!