Check out my new project Neocortex for the simplest way to integrate smart NPCs into your games:
neocortex.link
Hi Sercan, going to share you video with my college students. Great step by step. Looking forward to the next one.
Hello, first of all thanks for the video. At 10:48 in the video you selected the browDown blendshape, but on my Unity screen I have no browDown option, only mouthOpen and mouthSmile. The character's face turns left and right, but his mouth, eyebrows and eyes remain fixed. How can I solve it?
Since then we have changed the system a lot, so this tutorial has become somewhat outdated. You need to download an avatar with an Avatar Config that includes ARKit blendshapes.
docs.readyplayer.me/ready-player-me/integration-guides/unity/optimize/avatar-configuration
@@sgt3v Sercan, boss, you were Turkish all along 😃. Can you make an updated tutorial of this?
18:31 I can't wait to see that tutorial, that'd be magnificent!!!
Where is the updated face capture video?
Never knew this. Thanks a lot!
This is so awesome!!! I need to sit down and try it out!!! Can you do animations for the body and arms too?
Thank you! It is not possible with Live Capture, but Google MediaPipe could be implemented on top to get that done.
@@sgt3v ohhh!! ty ty --- def gonna look into it! Thank you for the tip
Sercan, thank you for your effort, brother.
Man looks like the NPC from GTA. 😂
Nice tutorial, bro.
Loved it. Can you make a tutorial about retargeting the face capture to another character?
Thanks Attila! As long as the head model has all the ARKit blendshapes, this approach should work with any character.
@@INGame. You need to use Apple's guide for ARKit blendshapes and add them all one by one. Or you might want to use a tool that helps with generating the blendshapes.
Brilliant video, works perfectly, thank you.
This is the best I've seen and I really appreciate you taking the time. You said at the end that you were planning a tutorial for demonstrating the face capture with a second layer combining Mixamo animations with the custom face tracking animations. That is what I need desperately. Have you done it? Is there a video you can recommend for the layer process if you're not going to be able to do it soon? This is an urgent need. I appreciate your help!
got something?
When I put my avatar in the Rig Prefab field at the face mapper stage, this warning message appears:
"The rig's SkinnedMeshRenderer components have not been assigned meshes with blend shapes, so blend shape mapping cannot be used."
And when I create and import a character from Ready Player Me following your steps, unlike yours, my avatar is named 'Renderer_avatar' in the Inspector, not 'Wolf 3D Avatar'.
@@Robloxzz this tutorial became a bit outdated on the RPM side; you will need to download an avatar with an Avatar Config in Unity:
docs.readyplayer.me/ready-player-me/integration-guides/unity/optimize/avatar-configuration
Hi Sercan, new to the latest facial recognition in Unity. I had to remove the Newtonsoft.Json plugin when importing the Ready Player Me SDK for Unity package, otherwise it gave me an error about having two of the same plugin. I don't know if that will happen for others. Excellent video. Took me an afternoon to get things running.
Thank you. Yes this is a known issue due to Version Control package having the Newtonsoft library as a dependency. You can find other known issues here: docs.readyplayer.me/integration-guides/unity/troubleshooting
tysm! 🙏
Does this only work in the Editor? Can you get live facial animations in builds with this?
Thank you for this. I was looking for a way of getting here without Vseeface. This should do it. Thanks again
Thank you so much :0)
Would love a reply even though I'm somewhat late to the party. I'm having trouble adding the renderers at 10:30. All it shows is empty, and it tells me that my Ready Player Me model doesn't have blend shapes.
Is it from demo.readyplayer.me or another subdomain?
@@sgt3v it was from readyplayerme. I figured it out and I’ll explain for others if they have the problem. You have to open your prefab in prefab mode and set the meshes manually. It didn’t make sense why it fixed it but it did.
@@twhippp Hi, I am still working on this but I don't understand what you mean by changing the meshes in prefab mode. The only mesh I found so far was the mesh of the skinned-mesh avatar under the Skinned Mesh Renderer. Besides that one, I didn't find any other meshes in the prefab. I manually assigned this prefab but it didn't change anything. What am I missing?
@@RossTubes I do apologize. I haven't worked with this kind of thing in Unity for so long. I'm making my own game now that doesn't require this. Unfortunately I can't answer your question now.
@@twhippp I understand. Do you maybe still have the old project somewhere, so I can use it?
Looks great. Modeling those 52 ARKit blendshapes is a lot of work. :)
Yes, Ready Player Me provides all of them and saves a lot of time!
@@sgt3v How do you get the 52 blendshapes? I haven't seen them in Unity. Do I need to make a certain Ready Player Me model? GLB or FBX?
Avatars from www.readyplayer.me have 52 ARKit blendshapes on them. RPM partners, though, have their own configurations.
@@sgt3v What do you mean by partners? So on the website I can't use a VRChat model; I have to create the Ready Player Me one?
VRChat is an RPM partner, thus they have their own subdomain. You should not use their models at all; you should make an avatar directly from the RPM website. The VRChat version of the avatars will not have ARKit blendshapes.
also thank you for doing this!
Hi Sercan, this method was working: when I try with my 3D character that I downloaded 2-3 days ago, it works. But now when I try with new avatars, face mapping is not working; only the head moves and the face smiles. How can I fix it? Could it be a problem with Ready Player Me, maybe they introduced a new system recently? Is it still working for you, can you try?
I have the same problem.
It took me hours and hours to find out what was going on; I'll tell you how I managed to do it.
When importing the avatar, you have to add this to the end of the URL, after `.glb`:
?morphTargets=mouthSmile,ARKit
and it worked perfectly. I hope it works for you.
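For anyone scripting their avatar downloads, the query-string fix above can be sketched like this in Python (the helper name and example URL are illustrative, not part of any SDK):

```python
def with_morph_targets(avatar_url: str, targets: list[str]) -> str:
    # Use '&' if the URL already has a query string, otherwise start one with '?'
    separator = "&" if "?" in avatar_url else "?"
    return f"{avatar_url}{separator}morphTargets={','.join(targets)}"

# Hypothetical avatar URL for illustration:
print(with_morph_targets("https://models.readyplayer.me/avatar.glb", ["mouthSmile", "ARKit"]))
# https://models.readyplayer.me/avatar.glb?morphTargets=mouthSmile,ARKit
```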
Very good thank you. I will use it!
Can we use it on Android phones?
Sadly, it works only on iOS.
Hi Sercan!
Thanks for the tutorial, love the straightforward format! I gave it a try with the newest version of the libs and I can't get it right. I'm getting numbers outside the 0-1 range from Live Capture, which deforms the RPM model. Do you have any tips to work around this issue?
Hi Kaio, this tutorial seems to be outdated now. You may try to map the values to the desired range, which should solve the problem.
did you solve this problem?
@@harunuyumaz3320 Yup, I did what Sercan suggested. You need to create a SimpleEvaluator and link it to your FaceMapper; adjusting the evaluator multiplier to 1 (the default is 100) worked for me.
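For anyone mapping the values by hand instead, the multiplier fix above boils down to a linear rescale with clamping — a minimal Python sketch (the function name and defaults are mine, not part of Live Capture):

```python
def remap_blendshape(value: float, in_max: float = 1.0, out_max: float = 100.0) -> float:
    # Scale from the capture range [0, in_max] to the target range [0, out_max],
    # clamping so out-of-range samples cannot deform the mesh.
    scaled = value * (out_max / in_max)
    return max(0.0, min(out_max, scaled))

print(remap_blendshape(0.5))   # 50.0
print(remap_blendshape(1.2))   # 100.0 (clamped)
```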
I enjoyed the video.
I can feel a slight delay. What are the specs of the Mac you used?
It was a MacBook Pro, not really the best device for this type of work. While recording, the fans were blowing really loudly. I matched the video to the phone recording myself; there was also a slight delay.
Hello Sercan, great vid! Have a question. How can we do a full body capture to fully animate the avatar?
Hi Mekan,
You can use DeepMotion or similar software with RPM avatars to create full-body tracking along with face tracking.
@@sgt3v thank you, will try it 🙏🏾
Thanks for this helpful tutorial. Is it possible to use pre-recorded videos instead of live tracking? I want to use the tool freely with other videos downloaded from the web.
As far as I know, it will only work with an iPhone capturing live from the camera.
Actually, if you want to use recorded facial animation, buy the Face Capture app on the Apple App Store and record the animation. If you export the file and import it in Unity, you can use the recorded face data.
Thanks for your answers. I will have a look at the app on the App Store.
Sir, once I build and test on my iPhone, the model is not moving. What do I need to do to fix that?
How do I make vivid facial expressions? With a face rig or blend shapes? And how do I do that?
Nice video... can you use your avatar in OBS Studio for YouTube?
Sure, you can use Animaze for it.
Hi Sercan, thanks for your video! I have made it work! But I have a question: sometimes the head doesn't rotate in the right direction and I have to fix the joint orientation in Maya, which is complicated. Is there any better way to set this up in Unity3D?
My phone says it could not connect to the server. What should I do?
Hello! Can we record this with body tracking as well? :)
Hey Sercan. Thank you for the video. Could you elaborate on the blendshapes? It would be great to understand those.
Hi Diego,
A blend shape (also known as a morph target) is a saved set of vertex positions for another form of the same model. Interpolating from the main shape to another gives us very smooth animations that are harder or more complex to achieve with bones.
For example, the head model in the Ready Player Me avatar has its mouth closed by default; this is our base shape. Then there is a `mouth_open` blend shape: the same model, but modelled with its mouth open and saved as a blendshape. Interpolating from the base shape to the `mouth_open` blendshape gives us an animation where the mouth opens.
The Unity Face Capture app provides us these values, so the face animates smoothly without needing any bones in there.
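The interpolation described above can be sketched in a few lines — a toy Python example with made-up vertex data, not engine code:

```python
# Minimal sketch of how a blendshape weight interpolates vertex positions.
# Vertices are (x, y, z) tuples; real engines do this per vertex on the GPU.

Shape = list[tuple[float, float, float]]

def apply_blendshape(base: Shape, target: Shape, weight: float) -> Shape:
    """Interpolate every vertex from the base shape toward the target shape.
    weight = 0 gives the base (mouth closed), weight = 1 the full target (mouth open)."""
    return [
        tuple(b + weight * (t - b) for b, t in zip(bv, tv))
        for bv, tv in zip(base, target)
    ]

closed = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]    # base shape vertices (illustrative)
opened = [(0.0, -0.5, 0.0), (1.0, -0.5, 0.0)]  # mouth_open target vertices
print(apply_blendshape(closed, opened, 0.5))
# [(0.0, -0.25, 0.0), (1.0, -0.25, 0.0)]
```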
Thank you for the reply @@sgt3v Is there a way to calibrate that in order for a more faithfully Mocap with RPM? Also, did you manage to solve that position issue when dealing with the Face Device?
@@diegooriani The Unity Live Capture package also provides settings to amplify or reduce the effect of a blendshape on the mapper; you can play around with the values there. By position issue, do you mean the avatar moving down when Face Capture runs? If so, that is expected: it happens because you are in edit mode and no body animation runs at the time.
@@sgt3v Thank you for the heads-up. Yes, I meant the avatar position. I find it a bit annoying to use a second avatar prefab to set up the scene and lights. But c'est la vie… Anyway, I hope you will continue creating more videos about the meta and XR. 👍
@@diegooriani Yes. You can open up your avatar in Blender and adjust the constraints of every blendshape/morph/shape key. Unity calls them blend shapes, Blender calls them shape keys, Maya calls them morphs; they're called something different in every 3D program. But yeah, it's no problem. Each one has a value between 0 and 100, and you can adjust them before you get to Unity. Works beautifully. Generally though, they're pretty good if they meet spec. The only one that's problematic just about everywhere is "mouthClose." If you use it at 100 without modifying it, it'll do some really weird things to your avatar.
Hey thanks for this video. I've followed your tutorial but when I try to match my face with the avatar the avatar's mouth crashes. It seems not connected with the skeleton. Can you help me?
Hi Marko, now that Ready Player Me avatars are requested with certain configurations in Unity, the process must be a little different. Can you check whether your avatar has ARKit blendshapes, and their range of values? The range might be [0, 1] whereas you applied values in [0, 100].
Thank you 😎
How wonderful, thank you so much!
It works perfectly, and I don't even know how to use Unity!
I make a humor and horror series on my channel by myself and this helps me a lot; I learn to use Unity while I make episodes! I invite you to watch them! Thank you... thank you!
Hey Sercan, thanks for the great tutorial. All works well. My issue is the avatar dropping to the floor (like it happens in your video). My avatar doesn't have a 'Head' sub object that I could select to fix. How to address this? Thanks
Hi! If you have a single-mesh avatar, selecting the Avatar mesh should be sufficient. The avatar dropping to the floor is just because it does not have an animator controller; you can add one and it will be fine.
@@sgt3v Can you explain a bit more? I am still having this issue.
We provide both single-mesh avatars and avatars with separated meshes. If your avatar is a single mesh, you can just use the whole body; otherwise, pick the head mesh and use it.
It's showing page not found of rpm unity sdk
Updated.
Hey Sercan! I liked the video, but when I tried, it said there were two Newtonsoft.Json.dll files. What does that mean? And is there a way to fix it?
Hi! Most probably you have another Unity Package that brings Newtonsoft in the project. You can safely delete the Newtonsoft folder that comes with RPM.
I tried doing that, and it says it's unable to delete it since it's a read-only file.
Could you try quitting Unity and then deleting it?
@@sgt3v It worked, thanks Sercan!
Can I do head tracking like this in a mobile app? Can I use it like that?
Yes, you can use the Unity ARKit package, AFAIK: docs.unity3d.com/Packages/com.unity.xr.arkit@4.1/manual/index.html
Is there a way to use this with a camera on the machine (webcam) rather than using the camera on the phone?
Not with this Unity package; it requires an iPhone camera with a depth sensor for ARKit support.
@@sgt3v Oh okay, thanks for getting back to me. Nice tutorial.
Please explain: can we use Android?
Sadly no, this Unity feature is iOS only.
Hi, does this work without the iPhone app, or is it mandatory?
Unfortunately, it's mandatory.
@@sgt3v OK, thanks for your reply!
Is this supported or available for Windows and Android? I'm going to make this for my school project; sorry, I can't afford an Apple device 🙏
Unfortunately, this is iOS only :(
@@sgt3v So, do you have a recommendation for an alternative system like this for Windows and Android?
Does the Face Capture app work with Unity, and if not, is there an alternative for it?
I suppose you meant to ask whether it works with Android; in that case, it does not.
@@sgt3v yes I meant Android , thank you for your response and explanation
My project doesn't open the window where I insert the link. How can I solve this?
Did you download and import the RPM Unity SDK?
Will it work with Windows and an iPhone?
Unfortunately no.
The mouth is not opening. Any fix?
Make sure you have an avatar with ARKit blendshapes on it, and that you did the mapping correctly.
Does it work with any Genesis figure?
I do not know what a Genesis figure is, but as long as it's a 3D model with ARKit blendshapes, it should work the same.
Does it also work when you do a build?
No, as far as I know, this package is intended for recording face captures via the companion app. For real-time face capture with iPhone, you can use ARKit XR Plugin of Unity.
@@sgt3v thx good to know
Can I use this with another character?
If the character has ARKit blendshapes then sure.
Hi Sercan, this method was working: when I try with my 3D character that I downloaded 2-3 days ago, it works. But now when I try with new avatars, face mapping is not working; only the head moves and the face smiles. How can I fix it? Could it be a problem with Ready Player Me, maybe they introduced a new system recently? Is it still working for you, can you try?
Could you make sure you are making an avatar with ARKit blendshapes?
You will need to assign an Avatar Config in the Settings window that has ARKit blendshapes enabled.