God bless you for walking through this step by step so clearly, instead of obnoxiously rattling off tons of unfamiliar lingo as fast as you can like some other tutorials.
Thank you Justin :D
Hands down the best beginner tutorial on this out there
Oh
A fellow Scot! That's still rare on YouTube. Thanks for the great tutorial.
I know I'm going to sound cynical, but
holy crap, a useful UE4 tutorial!
Please more!
Thanks!!! We'll do our best to do more soon!
Anyone having problems, enable the following plugins for your project (they can also be switched on in the .uproject file, as sketched below):
Live Link
ARKit
ARKit Face Support
Thanks for sharing Luccas ;)
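For anyone who prefers editing files directly, the same plugins can be enabled in the project's .uproject file. A minimal sketch, assuming UE4's internal plugin names are LiveLink, AppleARKit, and AppleARKitFaceSupport (the display names in the list above differ, so verify against the Plugins browser):

    {
      "FileVersion": 3,
      "Plugins": [
        { "Name": "LiveLink", "Enabled": true },
        { "Name": "AppleARKit", "Enabled": true },
        { "Name": "AppleARKitFaceSupport", "Enabled": true }
      ]
    }

Restart the editor afterwards so the plugins actually load.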
This is a fantastic tutorial, thanks very much for this! I'd happened by your site before, but didn't fully appreciate the service you're offering until now. I'm looking forward to working with Polywink on custom characters soon.
Hello! Thanks for your feedback regarding the tutorial! Really appreciated. We'll be working on providing more soon!
Have you checked our Studio Page for your custom characters? This page is dedicated to custom requests, and we'll provide a quote adapted to your requirements. Here is the link for Polywink Studio: www.polywink.com/57-studio.html 😊
Send us an email if you have any questions, we'll be happy to help!
@@Polywink Thanks for the heads up on the Studio Page! It was easy to navigate. Looking forward to discussing further.
Would love love love to see more videos on this, especially more character options from Polywink.
I'm definitely subscribing to this channel because of this video... Amazing tutorial by the way, thanks for this.
Hey Matty, we love reading feedback like yours. Thanks very much! We will publish more soon.
Is rear camera capture possible? I want to use the rear camera to capture facial expressions correctly from, say, a YouTube music video. I tried Memoji, but the iPhone only uses the front camera; I tried flipping the laptop to face the front camera to record, but it lags a lot and is inconvenient. Is there anything out there that allows rear camera capture? It would be so much more universal and usable, say, with friends capturing each other's funny moments rather than just yourself.
Hello Katie! We absolutely get your point, but as far as we know there's no TrueDepth sensor on the rear camera, which is probably why video tracking is not possible with it.
The best and most thorough tutorial on this subject! Great job!
Thank you!
Your voice and mouth sounds fit that character waaaay too well. Can't even listen to what you're saying... 😂
It is an amazing tutorial. At 15:39, in the Make Rotator part, you connect the first Float * Float to Y (Pitch), but that first Float * Float comes from HeadYaw, which is the Property Name on the Get Property Value node. So shouldn't it be connected to Z (Yaw)? I can't understand that part; could you please make it clearer?
This looks like Disney movie level 🤩👍👍👌👌
Thank you soooo much! I followed your tutorial and was finally able to connect to the Live Link software. Couldn't add textures, though, but it works, so I still consider this a win!!! Thank you.
Thanks for your kind words! Really happy that it helped you.
Hands-down the best easy to understand tutorial out there :)
thank you very much Rupesh!
That's super easy!!!!!! Thank you for making this.
Thank you for your nice feedback!
Amazing tutorial, thank you so much for making this :)
And thank you so much for your kind feedback!
I made better progress using this LiveLink tutorial in 30 minutes than in countless hours spent on other solutions using multiple plugins. Thanks! Is there a way to build the project out into an .exe file so that it is free of the Unreal Engine overhead?
You can of course turn a UE project into an .exe; you are just going to want to save a file of your captured data stream.
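In case it helps, the packaging step can also be scripted. A rough sketch using UE4's RunUAT BuildCookRun command, where the engine and project paths are hypothetical placeholders to adjust to your setup:

    "C:\Program Files\Epic Games\UE_4.26\Engine\Build\BatchFiles\RunUAT.bat" BuildCookRun ^
      -project="C:\Projects\FaceMocap\FaceMocap.uproject" ^
      -noP4 -platform=Win64 -clientconfig=Shipping ^
      -build -cook -stage -pak -archive -archivedirectory="C:\Builds"

The packaged .exe lands in the archive directory; the same packaging is also available in the editor under File > Package Project.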
Fantastic tutorial!
Glad you like it! Thanks for your feedback!
Good till I got to the neck, where there are no bones in the free template. I will have to start over with a new model that has actual bones in it. :/ I created two bones ("head" and "neck"), completed the tutorial, and had to make minor changes in the Blueprint for the yaw and roll (they were reversed).
Awesome tutorial and great results thanks!
Happy to know that it helped! :)
Thank you very much for such a detailed explanation, it is very helpful
Thanks Home Chien, we are more than happy to read that it was helpful for you!
Hello, brilliant tutorial. I am from India, and I am having a problem here: when I go to the phone, put in my IP address, and leave the port at the default, my phone's name doesn't appear in the Live Link drop-down option. Because of that I can't connect. Any solutions? Please help; I really want to do this.
Thank you so much for this tutorial. I need to use this feature on my full-body character, render the scene, and output the final animation. How can I do that? Like, how do I turn it into an animation sequence so I can add the face movements to my Sequencer? Thank you.
Looks amazing! It is too complicated for me; may I receive live help on how to install and connect everything? I would really appreciate that! If yes, I'm looking forward to purchasing it!
Hello Lumi, thanks for your message! We'll speak to you on Thursday :)
@@Polywink Thank you soooo much!
Good morning Polywink. I am having trouble with the import. Everything works quite swimmingly until I import to Unreal. Settings seem to be fine, but I don't get a skeletal mesh, just the individual objects. I've tried it in UE5 and UE 4.26 but it comes in the same way. At first I thought I had downloaded a corrupted file, but a second download was the same. Obviously I'm doing something wrong. Is there anyone that could give me a quick hand with this? I would love to have fun with this...
Hey! The head is moving in one direction, and the facial morphs are mirrored, moving in the opposite direction.
Hello! I'm sorry, we can't help you here as we are missing the information needed to diagnose this. The problem can come from the blendshapes, the rig, or the UE connections.
Great video, thanks.
My iPhone doesn't support that camera function, but sure, UE is great. Nice video.
Thanks Karel for your feedback!
@@Polywink I am just thinking about rigging in UE, for some animals and stylized characters. It seems fun with that mobile app.
Amazing tutorial!! Thanks so much for sharing your knowledge!!!!! For a full-body character, is there a way to have the upper body follow the head movement? It looks way too stiff when only the neck and above is moving :'(
Thanks so much for your feedback! Regarding your question, LiveLink only allows us to capture head and neck rotation. If you need the upper body to follow the head movement, you will have to set up your own connection in Unreal Engine.
Are the values and ranges on the clamp/float nodes pretty universal for face/head rigging, or specific to this model? Awesome tutorial, btw; it's cool to see a proper breakdown of the nodes.
In this tutorial the values are specific to Bella, but they should already work on your 3D model. If not, you can adjust them for your model. Hope this helps, and thanks very much for your feedback!
Great post... thank you so much😎
Thanks for your kind words Dwayne!
Hi, I wanted to know if the realistic translation onto the model is because of the number of blendshapes. Your site mentions more than 200 blendshapes, so I suppose the more targets, the better the solve. Thanks for the great tut.
So cool~ still trying to figure it out in UE4.
Nice, and thank you, BUT those of us who don't have an iPhone can't use it... so is there any alternative with Android or a webcam, like Faceware Studio but for free? :D
Hello George! We know that the Android option is missing; we are trying our best to find an alternative, but it is not that easy. Be sure that we will communicate once we have found a solution! Thank you!
Never thought I'd say "I wish I had an iPhone".
Hi! So this is pretty hecking badass.
Do we have the ability to run it in the game preset for a new project? I'm way too new, so I don't know if the difference is major enough.
And one last thing: would we be able to run this real-time tracking method in a compiled .exe, or will we have to keep using just the editor if we wanted to use this for, like, a very fancy virtual camera output for character acting?
About the blendshapes: how does the app know which blendshape to use for which movement? Is it a naming scheme? Do the number and naming of the blendshapes have to match the MetaHuman blendshapes?
Hello Ayush, regarding the first part of your question: yes, the blendshapes follow the ARKit naming scheme. Regarding the MetaHuman part, we can't help you, unfortunately!
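For anyone wondering what that naming scheme looks like: ARKit exposes 52 face coefficients with camelCase names, for example:

    eyeBlinkLeft, eyeBlinkRight, eyeLookUpLeft, eyeLookDownRight,
    jawOpen, jawForward, mouthSmileLeft, mouthSmileRight,
    mouthFunnel, mouthPucker, browInnerUp, browOuterUpLeft,
    cheekPuff, noseSneerLeft, tongueOut

The full list is in Apple's ARFaceAnchor blendshape documentation. Note that the capitalization of the curve names can differ on the UE side (the video shows HeadYaw, for instance), so check the names your Live Link subject actually streams.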
Is it possible to use one Live Link connection to animate multiple faces?
Looks like it lacks a bit of “😮” expression? Is there a fix?
Hi Stinn, the blendshapes follow the ARKit convention (52 blendshapes), so some deformations are not accessible with this format!
Thank you for the video; my facial capture was successful. But the neck doesn't move, and I don't know why 😊
How can I give the eyeballs color? It doesn't seem to work if I just change the color of the sh_eyeballs material.
I just wish there was a way to export the facial mocap data as an FBX.
Hi Pradip, even if Live Link does not allow it, Face Cap does: apps.apple.com/fr/app/face-cap-motion-capture/id1373155478
How do you build a different face than the one you downloaded? I don't see an easy option to do so from the Polywink website
Hello Justin Music, what do you mean by building a different face? Polywink is a website that offers rigs and blendshapes to animate your 3D character. We are not a character provider company (but it could be nice) :)
I had the phone set to the wrong Wi-Fi. I corrected it, but it's still not showing up.
Check that you have the ARKit plugins enabled in UE4 as well.
@@dialusdudas They were... the problem fixed itself with a restart.
To do that facial mocap, can it be done with an iPhone 8? And is Face ID necessary for it to work?
The website to download the model is down. Is there an alternative link?
Hi McTittles, apologies for answering late. The website should be working by now; if not, please send us an email at contact@polywink.com. Thank you very much!
Hey Polywink, I have an iPhone 11 and a 12, and I wonder whether Apple ARKit works on these devices.
Hello Rajendra! We named the service 'Animation for iPhone X' because the iPhone X was the first one to allow video tracking thanks to the TrueDepth camera. The iPhone 11 and 12 also have the TrueDepth camera, so ARKit will work with your devices :)
@@Polywink thanks for the information really excited to work with you ❤️❤️
Thanks! Can I use an iPhone XS? Which iPhones support this app?
Hello! Thanks for your feedback! The app works with the iPhone X and later versions (it needs to have the TrueDepth camera).
Is it possible to record the face using the Live Link app without actually having a connection to Unreal, then later import the LiDAR (or whatever) video footage from my phone into UE?
Hi Daniel, thank you for your comment! We are not 100% sure about this, but we don't think it is possible, as the LiveLink mobile app only has video recording and no video import option.
Same question! If you ever find out, please update...
Thanks for the video! I was trying to texture the sample character in Substance Painter, but:
1) I can't seem to paint the eyes - is this a glitch?
2) How do I import this back into Unreal Engine so that it still works with the rigging, etc.?
I hope someone can help :)
Thank you
My phone is an Android. Does it work?
Maybe you should sell models in the future? I'm hesitant to pay $500 to give my own model blendshapes when it likely will not look nearly as high-quality as the Polywink models.
What we do best for now is providing great tools to animate any 3D character, but why not offer models in the future; thanks for the tip, Lae Mae! Polywink quality remains the same from one model to another. If you need information on how to upload your 3D model to Polywink to get the best results, here is the link for the tutorial: ua-cam.com/video/NAIRpsBfHRo/v-deo.html
Also, feel free to send us an email if you have any questions! :)
Wow, this is incredible and I would LOVE to use this service, but 499 euros is a crazy amount for me (a self-taught indie dev).
Hey! Thanks for your message. We understand your point; have you seen that we offer an FBX format for 299 euros?
From your video description, what is the charge to set up the 52 ARKit blendshapes? E.g. for a Paragon character: I believe they have the facial bones, but not the blendshapes. I hope I'm making sense to you.
Hi Tayam, can you please clarify what you mean by "charge"? Thank you! Regarding the Paragon character, we don't know if they use facial bones or blendshapes. Sorry for not being able to help you more!
@@Polywink Hey, thanks for getting back to me. From your video description and the website, if I'm not mistaken, you provide a service to create/add the 52 custom ARKit blendshapes to a custom 3D model? So I was wondering how much you'd charge for a Paragon character. And yes, I've checked: they have facial bones, but not blendshapes, so it's not possible to use Live Link on Paragon characters. Hope this clarifies.
This is great, but does it work on Mac? You repeatedly said "on your PC", which would suggest that maybe not?
Hello Daniel, we did this tutorial on PC but it should work on Mac as it is an Epic software. Please let us know if it doesn't and we'll try to figure this out!
but…but… is the Unreal part functional on the Mac version?
Hello! I have found this... so far not so good for the UE Mac version: www.washingtonpost.com/video-games/2020/08/17/apple-cuts-off-epic-its-tools-endangering-future-unreal-engine-projects-ios-mac/
Hi, my roll and yaw are swapped around. Went through it 3x to double-check and I'm not sure what to edit. Any help would be great. TY
Hi Andrew, make sure that your "Transform (Modify) Bone" node's rotation space is set to Bone Space, and don't forget to rebuild your scene before testing. You also need to check the input connections of your "Make Rotator" nodes.
If the problem persists, it might be due to the original orientation of your joints.
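To make those input connections concrete, here is the mapping as we understand it for UE's rotator convention (X = Roll, Y = Pitch, Z = Yaw), using the head curve names from the video (HeadRoll, HeadPitch, HeadYaw):

    HeadRoll  * scale  ->  Make Rotator X (Roll)
    HeadPitch * scale  ->  Make Rotator Y (Pitch)
    HeadYaw   * scale  ->  Make Rotator Z (Yaw)

Note that the video wires HeadYaw into Y (Pitch), which looks wrong on paper but can be correct when the joint's local axes don't line up with world axes, as on Bella. If roll and yaw come out swapped on your own character, its joint orientation likely differs, so swap those two inputs or re-orient the joint.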
This is an amazing video. Hey, I was wondering, can you use UE4 to create a virtual studio and transfer that over to Streamlabs as a scene?
Hello! Thanks for your positive feedback! Yes, you can use the Animation for iPhone X to create virtual productions in UE4. Send us a message at contact@polywink.com if you need more information. Talk soon :)
Hi there! Does your service produce a whole new model, ready to be animated? Or do we need our own model?
Hello Duck Lord (love your name)! We do not produce 3D models but only create facial rigs tailored to your own 3D model. If you create or purchase a 3D model, we can check if it's compatible with our services. Send us an email at contact@polywink.com and a member of our team will get back to you within 48 hours.
How can I use the skeleton and mesh for my own character?
Hello Adam! Unfortunately, you can't use the skeleton and mesh for your own character, as written in the licence.
Hello, question… How can I add facial mocap to a character that has existing animation (body animation with no facial animation)?
Hi Dj Juanito, if you already have a body animation, you can add the Live Link facial mocap on top; this way you can have both animations playing simultaneously. This tutorial can help you with the facial animation; however, you might find other tutorials that cover the full combination (unfortunately not on our YouTube page yet!)
What version of UE4 is this that you used?
Hi Roy! We used version 4.26.1, but you can use any version that supports the LiveLink plugin!
What if you don't have an iPhone? Does it work on Android?
Hello Robert, unfortunately this only works with an iPhone as the mocap device! Have a nice holiday!
Can anyone help me with how to make my model appear in the Live Link Subject Name?
Hello Gabriel, in order to get your phone's name in the Live Link Subject Name, you need to make sure your phone is connected to the same Wi-Fi as your computer. You also need to make sure you have entered your computer's exact IP address in the Live Link app.
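A quick way to find that IP address, for anyone stuck at this step (these are OS commands, not app settings):

    Windows:       ipconfig            (look for "IPv4 Address" on your Wi-Fi adapter)
    macOS/Linux:   ifconfig  or  ip a  (look for the "inet" entry on the active interface)

Then enter exactly that address as the target in the Live Link app on the phone.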
Does the eyeball use bone rotation?
Hello Lemon, in this service (Animation for iPhone X) the eyeball rotation is included in the blendshapes.
If you would rather target the eye rotation in the bones, we can exclude the eyeball rotation from the blendshapes. I hope this helps; otherwise, send us a message at contact@polywink.com 😊
Is there any possibility for Android?
Hello, ma'am. Does this work with any character?
Hello! Yes, you can use this tutorial for any 3D character that has previously been rigged with the 52 ARKit blendshapes.
Hi, may I know how to export it as a video? It exports as a series of JPG image files; I couldn't get a video file.
Hello Karuthu, I think this will help you: docs.unrealengine.com/4.27/en-US/AnimatingObjects/Sequencer/HighQualityMediaExport/RenderSettings/OutputFormats/
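If you already have the JPG sequence, you can also assemble it into a video yourself with ffmpeg. A one-line sketch, assuming frames named render.0001.jpeg onward and a 30 fps render (adjust the filename pattern and framerate to your own output settings):

    ffmpeg -framerate 30 -i render.%04d.jpeg -c:v libx264 -pix_fmt yuv420p output.mp4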
Will the iPhone X work?
Absolutely! This is also why we named the 52 ARKit blendshapes Polywink service 'Animation for iPhone X' :) Have fun!
This would be a great tutorial, but I have an iPhone 8, so it's not compatible. Great tutorial though.
It is compatible with the iPhone X and higher versions. Sorry you can't do it with your current phone; maybe with the next one! Thank you for your kind words!
@@Polywink Yes, it's a shame!! Great video though, looks really interesting.
So how do you record this stuff?
How can I create my 3D model in Unreal? Can I capture myself with a LiDAR camera?
Hello Tse Alex, we don't offer this kind of tutorial yet, but I'm sure you can find a LOT on YouTube. Good luck with your research!
The iPhone will heat up; use an iPad instead if you have one.
Thanks Dhruv Joshi for sharing!
You should've mentioned iPhone-only in the title 😔
This is a great point indeed, Optimus; thanks for sharing it with us. We'll do what's necessary to change this!
So anyone can do it? Show it on Android.
Why did you stop doing videos, man :c
Lack of time :( but hopefully we will launch new tutorials this year!
Going to be purchasing this in a few months when it's time to develop all these fucking cutscenes. Is this the easiest way to get facial animation?
ARKit is indeed the easiest way to get facial mocap!
Love it, but it doesn't work for me, idk why :(
Hey Beatriz, we are sorry to hear that! We hope you will find a solution :)
Is Live Link possible for Android?
Hi Shayden, unfortunately there is no Android app for the moment! :(
What about audio?
LiveLink only records facial data; you can use another channel for your audio!
Uhhhhh........🤔🤔🤔🤔
What if you dont have an iPhone? 😅
iPhones are still expensive.
Hey Dude Slick, indeed, they are still expensive! But still cheaper than other mocap devices, don't you think?
The iPhone itself is expensive, so I will pass and stick to Android techniques 🙃
Okay :) All the best for your future projects!
iPhone is out of date, I'm waiting for Android.
Okay, let's wait until this is available with Android! :)
Eyes = dead
Hello La Sal, the eyes are not dead, just not textured! Are you using the sample? Thanks!
@@Polywink As in uncanny valley. Notice the difference between this and Dreamworks or Pixar eyes.
Amazing tutorial, thank you so much for making this :)
Thank you for sharing your feedback; we're very happy that it helped you! :)
Is it possible to use this recorded blend shape data in Blender, and if so, how?
Is it possible to use the Live Link Face app with the Samsung Galaxy S20? It's only $499.
Are there any add-ons for Blender? Live Link is the only free tool I found, but there doesn't seem to be a way to export your model as an FBX.
Hi Gaming with Paribes, there's a Live Link for Blender according to this page: www.unrealengine.com/marketplace/en-US/product/livelink-for-blender/questions and our Animation for iPhone X service has an FBX export. Is that what you meant? Thanks!
@@Polywink Well, I tried exporting, but it exported as a CSV.
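That CSV can still be useful: the Live Link Face app saves recordings as per-frame blendshape values with a header row of curve names, so you can key them onto Blender shape keys with a short script. A rough sketch, assuming your CSV follows that layout, your mesh's shape keys use the same ARKit names, and the path shown (all three are assumptions, so inspect your export first):

    # Run inside Blender with the face mesh selected.
    import csv
    import bpy

    keys = bpy.context.active_object.data.shape_keys.key_blocks

    with open("/path/to/recording.csv") as f:  # hypothetical path
        for frame, row in enumerate(csv.DictReader(f), start=1):
            for name, value in row.items():
                if name in keys:  # skips non-blendshape columns like Timecode
                    keys[name].value = float(value)
                    keys[name].keyframe_insert("value", frame=frame)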