Ready Player Me x Unity Live Capture / Face Capture App

  • Published 4 Jan 2025

COMMENTS • 124

  • @sgt3v
    @sgt3v  1 month ago

    Check out my new project Neocortex for the simplest way to integrate smart NPCs into your games:
    neocortex.link

  • @pixelshoppe6253
    @pixelshoppe6253 2 years ago +1

    Hi Sercan, going to share your video with my college students. Great step-by-step. Looking forward to the next one.

  • @maarifhasan4587
    @maarifhasan4587 1 year ago +2

    Hello, first of all thanks for the video. At 10:48 you select the browDown blendshape, but in my Unity window there is no browDown option, only mouthOpen and mouthSmile. The character's face turns left and right, but the mouth, eyebrows and eyes stay fixed. How can I solve this?

    • @sgt3v
      @sgt3v  1 year ago +1

      Since then we have changed the system a lot, so this tutorial has become somewhat outdated. You need to download an avatar with an avatar config that includes the ARKit blendshapes.
      docs.readyplayer.me/ready-player-me/integration-guides/unity/optimize/avatar-configuration
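
      A minimal sketch of what that can look like in code, assuming the current RPM Unity SDK names (AvatarObjectLoader, AvatarConfig and its event/argument members are assumptions here, so check them against your SDK version and the docs linked above):

          using ReadyPlayerMe.Core; // assumed namespace; older SDK versions use ReadyPlayerMe
          using UnityEngine;

          public class ArkitAvatarLoader : MonoBehaviour
          {
              // Assumed: an Avatar Configuration asset (Create > Ready Player Me > Avatar Configuration)
              // with the ARKit morph target group enabled, assigned in the Inspector.
              [SerializeField] private AvatarConfig arkitConfig;
              [SerializeField] private string avatarUrl = "https://models.readyplayer.me/your-avatar-id.glb"; // placeholder URL

              private void Start()
              {
                  // Assumed loader type and members from the RPM Unity SDK.
                  var loader = new AvatarObjectLoader { AvatarConfig = arkitConfig };
                  loader.OnCompleted += (_, args) => Debug.Log($"Loaded {args.Avatar.name}");
                  loader.OnFailed += (_, args) => Debug.LogError(args.Message);
                  loader.LoadAvatar(avatarUrl);
              }
          }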

    • @maarifhasan4587
      @maarifhasan4587 1 year ago

      @@sgt3v Sercan boss, so you're Turkish? 😃 Can you make an updated tutorial of this?

  • @micharomak5290
    @micharomak5290 1 year ago +1

    18:31 I can't wait to see that tutorial, that'd be magnificent!!!

  • @muratefendi876
    @muratefendi876 1 year ago +2

    Where is the updated face capture video?

  • @Tmoproject
    @Tmoproject 2 months ago

    Never knew this. Thanks a lot!

  • @reinasalti
    @reinasalti 1 year ago

    This is so awesome!!! I need to sit down and try it out!!! Can you do animations for the body and arms too?

    • @sgt3v
      @sgt3v  1 year ago +1

      Thank you! It is not possible with LiveCapture but Google Mediapipe could be implemented on top to get that done.

    • @reinasalti
      @reinasalti 1 year ago +1

      @@sgt3v ohhh!! ty ty --- def gonna look into it! Thank you for the tip

  • @abikevser
    @abikevser 2 years ago +2

    Sercan, thanks for all your hard work, brother.

  • @Tropicaya
    @Tropicaya 2 years ago

    Man looks like the NPC from GTA. 😂
    Nice tutorial, bro.

  • @atyy123
    @atyy123 2 years ago +1

    Loved it. Can you make a tutorial about retargeting the face capture to another character?

    • @sgt3v
      @sgt3v  2 years ago

      Thanks Attila! As long as the head model has all the ARKit blendshapes, this approach should work with any character.

    • @sgt3v
      @sgt3v  2 years ago

      @@INGame. You need to use Apple's guide for the ARKit blendshapes and add them all one by one. Or you might want to use a tool that helps with generating the blendshapes.
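
      If you are adding them by hand, a quick way to see which ARKit names a mesh is still missing is to walk its blendshape list with standard Unity APIs (a minimal sketch; only a few of the 52 ARKit names from Apple's ARFaceAnchor documentation are listed here):

          using System.Linq;
          using UnityEngine;

          public class BlendShapeChecker : MonoBehaviour
          {
              // A handful of the 52 ARKit blendshape names; extend this list with the full set.
              private static readonly string[] RequiredShapes =
                  { "browDownLeft", "browDownRight", "eyeBlinkLeft", "eyeBlinkRight", "jawOpen", "mouthSmileLeft" };

              private void Start()
              {
                  foreach (var skinned in GetComponentsInChildren<SkinnedMeshRenderer>())
                  {
                      var mesh = skinned.sharedMesh;
                      var names = Enumerable.Range(0, mesh.blendShapeCount)
                                            .Select(mesh.GetBlendShapeName)
                                            .ToArray();
                      foreach (var shape in RequiredShapes)
                      {
                          // Some exporters prefix blendshape names with the mesh name, so match loosely.
                          if (!names.Any(n => n.EndsWith(shape)))
                              Debug.LogWarning($"{skinned.name} is missing ARKit blendshape '{shape}'");
                      }
                  }
              }
          }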

  • @vrworld_games4158
    @vrworld_games4158 2 years ago

    Brilliant video, works perfectly, thank you.

  • @sharonj731
    @sharonj731 2 years ago +1

    This is the best I've seen and I really appreciate you taking the time. You said at the end that you were planning a tutorial for demonstrating the face capture with a second layer combining Mixamo animations with the custom face tracking animations. That is what I need desperately. Have you done it? Is there a video you can recommend for the layer process if you're not going to be able to do it soon? This is an urgent need. I appreciate your help!

  • @Robloxzz
    @Robloxzz 1 year ago +1

    When I put my avatar in the Rig Prefab item at the face mapper stage, this warning message appears.
    The rig’s SkinnedMeshRenderer components have not been assigned meshes with blend shapes, so blend shape mapping cannot be used

    • @Robloxzz
      @Robloxzz 1 year ago +1

      And when I create and import a character from Ready Player Me following your steps, unlike yours, the avatar's Inspector shows it named 'Renderer_avatar', not 'Wolf 3D Avatar'.

    • @sgt3v
      @sgt3v  1 year ago

      @@Robloxzz This tutorial became a bit outdated on the RPM side; you will need to download an avatar with an Avatar Config in Unity:
      docs.readyplayer.me/ready-player-me/integration-guides/unity/optimize/avatar-configuration

  • @pixelshoppe6253
    @pixelshoppe6253 2 years ago +3

    Hi Sercan, I'm new to the latest facial recognition in Unity. I had to remove the Newtonsoft Json plugin when importing the Ready Player Me SDK for Unity package, otherwise it gave me an error about having two of the same plugin. Don't know if that will happen for others. Excellent video. Took me an afternoon to get things running.

    • @sgt3v
      @sgt3v  2 years ago +1

      Thank you. Yes, this is a known issue due to the Version Control package having the Newtonsoft library as a dependency. You can find other known issues here: docs.readyplayer.me/integration-guides/unity/troubleshooting

    • @automata4466
      @automata4466 2 years ago +1

      tysm! 🙏

  • @makled
    @makled 1 month ago

    Does this only work in the Editor? Can you get live facial animations in builds with this?

  • @PriestessOfDada
    @PriestessOfDada 2 years ago +1

    Thank you for this. I was looking for a way of getting here without Vseeface. This should do it. Thanks again

    • @sgt3v
      @sgt3v  2 years ago

      Thank you so much :0)

  • @twhippp
    @twhippp 2 years ago

    Would love a reply even though I'm somewhat late to the party. I'm having trouble adding the renderers at 10:30; the list just shows up empty, and it tells me that my Ready Player Me model doesn't have blend shapes.

    • @sgt3v
      @sgt3v  2 years ago

      Is it from demo.readyplayer.me or another subdomain?

    • @twhippp
      @twhippp 2 years ago

      @@sgt3v It was from readyplayerme. I figured it out and I'll explain for others who have the same problem: you have to open your prefab in Prefab Mode and set the meshes manually. It didn't make sense why that fixed it, but it did.

    • @RossTubes
      @RossTubes 1 year ago

      @@twhippp Hi, I am still working on this but I don't understand what you mean by changing the meshes in Prefab Mode. The only mesh I found so far was the mesh of the skinned avatar under the SkinnedMeshRenderer; besides that one I didn't find any other meshes in the prefab. I manually assigned this prefab but it didn't change anything. What am I missing?

    • @twhippp
      @twhippp 1 year ago +1

      @@RossTubes I do apologize. I haven't worked with this kind of thing in Unity for so long. I'm making my own game now that doesn't require this, so unfortunately I can't answer your question now.

    • @RossTubes
      @RossTubes 1 year ago

      @@twhippp I understand. Do you maybe still have the old project somewhere so I can use it?

  • @facemotion3d971
    @facemotion3d971 3 years ago +1

    Looks great. Modeling those 52 ARKit blendshapes is a loooot of work. :)

    • @sgt3v
      @sgt3v  3 years ago

      Yes, Ready Player Me provides all of them and saves a lot of time!

    • @OreoTheDJ
      @OreoTheDJ 2 years ago

      @@sgt3v How do you get the 52 blendshapes? I haven't seen them in Unity. Do I need to make a certain Ready Player Me model? GLB or FBX?

    • @sgt3v
      @sgt3v  2 years ago

      Avatars from www.readyplayer.me have the 52 ARKit blendshapes on them. RPM partners, though, have their own configurations.

    • @OreoTheDJ
      @OreoTheDJ 2 years ago

      @@sgt3v What do you mean by partners? So I can't use a VRChat model on the website, I have to create the Ready Player Me one?

    • @sgt3v
      @sgt3v  2 years ago

      VRChat is an RPM partner, thus they have their own subdomain. You should not use their models at all; you should make an avatar directly from the RPM website. The VRChat version of the avatars will not have ARKit blendshapes.

  • @michellevawer5207
    @michellevawer5207 2 years ago +1

    also thank you for doing this!

  •  2 years ago +1

    Hi Sercan, this method is working. When I try with my 3D character that I downloaded 2-3 days ago it works, but when I try with new avatars, the face mapping is not working; just the head moves and the face smiles. How can I fix it? Could it be a problem with readyplayerme, maybe they introduced a new system recently? Is it still working, can you try?

    • @oscardorianchannel
      @oscardorianchannel 1 year ago

      I have the same problem.

    • @oscardorianchannel
      @oscardorianchannel 1 year ago

      It took me hours and hours to find out what was going on, so I'll tell you how I managed to do it.
      When importing the avatar you have to add this at the end of the URL, after the .glb:
      ?morphTargets=mouthSmile,ARKit
      and it worked perfectly. I hope it works for you.
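
      In Unity code that amounts to appending the query string from the comment above to the .glb URL before handing it to your avatar loader (a small sketch; the avatar ID and helper name are placeholders):

          public static class AvatarUrl
          {
              // Asks the Ready Player Me endpoint for the GLB with the ARKit morph target set included.
              public static string WithArkitMorphTargets(string avatarId) =>
                  $"https://models.readyplayer.me/{avatarId}.glb?morphTargets=mouthSmile,ARKit";
          }

          // Usage: pass AvatarUrl.WithArkitMorphTargets("your-avatar-id") to the loader
          // instead of the plain .glb URL.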

  • @finansdoktoru
    @finansdoktoru 2 years ago +1

    Very good thank you. I will use it!

    • @finansdoktoru
      @finansdoktoru 2 years ago

      Can we use it on Android phones?

    • @sgt3v
      @sgt3v  2 years ago +1

      Sadly it works only on iOS.

  • @KaioDuarteds
    @KaioDuarteds 1 year ago

    Hi Sercan!
    Thanks for the tutorial, love the straightforward format! I gave it a try with the newest version of the libs and I can't get it right: I'm getting numbers outside the 0-1 range from Live Capture, which deforms the RPM model. Do you have any tips to work around this issue?

    • @sgt3v
      @sgt3v  1 year ago

      Hi Kaio, this tutorial seems to be outdated now. You may try mapping the values to the desired range, which should solve the problem.

    • @harunuyumaz3320
      @harunuyumaz3320 9 months ago

      Did you solve this problem?

    • @KaioDuarteds
      @KaioDuarteds 9 months ago

      @@harunuyumaz3320 Yup, I did what Sercan suggested. You need to create a SimpleEvaluator and link it to your FaceMapper; adjusting the evaluator multiplier to 1 (the default is 100) worked for me.
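
      If you would rather also guard against this in code, clamping the incoming coefficient before converting it to a Unity blendshape weight is enough (a minimal sketch with a hypothetical helper; ARKit coefficients are expected in [0, 1], Unity blendshape weights in [0, 100]):

          using UnityEngine;

          public static class FaceWeightUtil
          {
              // Clamps an ARKit coefficient into [0, 1] and scales it to Unity's 0-100 blendshape range,
              // so out-of-range capture values can no longer deform the mesh.
              public static float ToBlendShapeWeight(float arkitValue) =>
                  Mathf.Clamp01(arkitValue) * 100f;
          }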

  • @eggo4872
    @eggo4872 2 years ago

    I enjoyed the video.
    I can see a slight delay. What is the spec of the Mac you used?

    • @sgt3v
      @sgt3v  2 years ago +1

      It was a MacBook Pro, not really the best device for this type of work; while recording, the fans were blowing really loudly. I synced the video to the phone recording myself, and there was also a slight delay.

  • @flowerinpower
    @flowerinpower 2 years ago

    Hello Sercan, great vid! I have a question: how can we do full-body capture to fully animate the avatar?

    • @sgt3v
      @sgt3v  2 years ago +1

      Hi Mekan,
      You can use Deep Motion or similar software with RPM avatars to create full-body tracking along with face tracking.

    • @flowerinpower
      @flowerinpower 2 years ago

      @@sgt3v thank you, will try it 🙏🏾

  • @TechTrek_Innovations
    @TechTrek_Innovations 3 years ago

    Thanks for this helpful tutorial. Is it possible to use pre-recorded videos instead of live tracking? I want to use the tool freely with other videos downloaded from the web.

    • @sgt3v
      @sgt3v  3 years ago

      As far as I know it will only work with an iPhone, captured live from the camera.

    • @gopm466
      @gopm466 2 years ago

      Actually, if you want to use recorded facial animation, buy a face capture app on the Apple App Store and record the animation. If you export the file and import it into Unity, you can use the recorded face.

    • @TechTrek_Innovations
      @TechTrek_Innovations 2 years ago

      Thanks for your answers. I will have a look at the app on the App Store.

  • @Rizwan-Ashraf
    @Rizwan-Ashraf 1 year ago

    Sir, once I build and test on my iPhone, the model is not moving. What do I need to do for that?

  • @michaelli7000
    @michaelli7000 2 years ago

    How do I make vivid facial expressions? A face rig or blend shapes? And how do I do that?

  • @frutavidaycomida9562
    @frutavidaycomida9562 2 years ago

    Nice video... can you use your avatar in OBS Studio for YouTube?

    • @sgt3v
      @sgt3v  2 years ago

      Sure, you can use Animaze for it.

  • @leezj8108
    @leezj8108 2 years ago

    Hi Sercan, thanks for your video! I have made it! But I have a question: sometimes the head doesn't rotate in the right direction, and I have to fix the joint orientation in Maya, which is complicated. Is there any better way to set this up in Unity3D?

  • @vedikagupta9131
    @vedikagupta9131 1 year ago

    My phone says it could not connect to the server. What should I do?

  • @thatOne873
    @thatOne873 2 years ago +1

    Hello! Can we record this with body tracking as well? : )

  • @diegooriani
    @diegooriani 3 years ago

    Hey Sercan. Thank you for the video. Could you elaborate on the blendshapes? It would be great to understand those.

    • @sgt3v
      @sgt3v  3 years ago +2

      Hi Diego,
      A blend shape (also called a morph target) is a saved set of vertex positions for another form of the same model. Interpolating from the main shape to another gives us very smooth animations that would be harder or more complex to achieve with bones.
      For example, the head model we have in the Ready Player Me avatar has its mouth closed by default; this is our base shape. Then there is a `mouth_open` blend shape, the same model but modelled with its mouth open and saved as a blendshape. Interpolating from the base shape to the `mouth_open` blendshape gives us an animation where the mouth opens.
      The Unity Face Capture app provides us these values, so the face animates smoothly without having any bones in there.
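
      As a concrete illustration of that interpolation, driving a single blendshape from a script only needs the standard SkinnedMeshRenderer API (a minimal sketch; the "jawOpen" name is assumed to exist on your mesh):

          using UnityEngine;

          public class MouthOpenDemo : MonoBehaviour
          {
              [SerializeField] private SkinnedMeshRenderer faceRenderer; // the mesh that carries the blendshape

              private int mouthOpenIndex = -1;

              private void Start()
              {
                  // Look up the blendshape by name; "jawOpen" is the ARKit name, other rigs may differ.
                  mouthOpenIndex = faceRenderer.sharedMesh.GetBlendShapeIndex("jawOpen");
              }

              private void Update()
              {
                  if (mouthOpenIndex < 0) return;
                  // Interpolate between the base shape (weight 0) and the fully open shape (weight 100).
                  float weight = Mathf.PingPong(Time.time * 50f, 100f);
                  faceRenderer.SetBlendShapeWeight(mouthOpenIndex, weight);
              }
          }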

    • @diegooriani
      @diegooriani 3 years ago

      Thank you for the reply @@sgt3v. Is there a way to calibrate that for more faithful mocap with RPM? Also, did you manage to solve that position issue when dealing with the Face Device?

    • @sgt3v
      @sgt3v  3 years ago +1

      @@diegooriani The Unity Live Capture package also provides settings to amplify or reduce the effect of a blendshape on the mapper; you can play around with the values there. By position issue do you mean the avatar moving down when Face Capture runs? If so, that is expected and happens because you are in edit mode and no body animation runs at that time.

    • @diegooriani
      @diegooriani 3 years ago +1

      @@sgt3v Thank you for the heads-up. Yes, I meant the avatar position. I find it a bit annoying to use a second avatar prefab to set up the scene and lights. But c'est la vie… Anyway, I hope you will continue creating more videos about the metaverse and XR. 👍

    • @PriestessOfDada
      @PriestessOfDada 2 years ago

      @@diegooriani Yes. You can open up your avatar in Blender and adjust the constraints of every blendshape/morph/shape key. Unity calls them blendshapes, Blender calls them shape keys, Maya calls them morphs; they're called something different in every 3D program. But yeah, it's no problem. Each one has a value between 1 and 100, and you can adjust them before you get to Unity. Works beautifully. Generally though, they're pretty good if they meet spec. The only one that's problematic just about everywhere is "mouthClose". If you use it at 100 without modifying it, it'll do some really weird things to your avatar.

  • @MarcoGadaleta-v7h
    @MarcoGadaleta-v7h 1 year ago

    Hey, thanks for this video. I've followed your tutorial, but when I try to match my face with the avatar, the avatar's mouth breaks. It doesn't seem to be connected to the skeleton. Can you help me?

    • @sgt3v
      @sgt3v  1 year ago

      Hi Marco, now that Ready Player Me avatars are requested with certain configurations in Unity, the process must be a little different. Can you check whether your avatar has ARKit blendshapes and what their value range is? The range might be [0, 1] whereas you applied values in [0, 100].

  • @CHITUS
    @CHITUS 2 years ago

    Thank you 😎

  • @oscardorianchannel
    @oscardorianchannel 2 years ago +1

    How wonderful, thank you so much!
    It works perfectly and I don't even know how to use Unity!
    I make a comedy and horror series on my channel by myself and this helps me a lot; I learn to use Unity while I make episodes! I invite you to watch them! Thank you... thank you

  • @R--Tech
    @R--Tech 2 years ago

    Hey Sercan, thanks for the great tutorial. All works well. My issue is the avatar dropping to the floor (like it does in your video). My avatar doesn't have a 'Head' sub-object that I could select to fix it. How do I address this? Thanks

    • @sgt3v
      @sgt3v  2 years ago

      Hi! If you have a single-mesh avatar, selecting the avatar mesh should be sufficient. The avatar dropping to the floor is just because it does not have an Animator Controller; you can add one and it will be fine.

    • @kenthigh8121
      @kenthigh8121 2 years ago

      @@sgt3v Can you explain a bit more? I am still having this issue.

    • @sgt3v
      @sgt3v  2 years ago

      We provide both single-mesh avatars and avatars with separated meshes. If your avatar is a single mesh then you can just use the whole body, or pick the head mesh and use it.
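
      For anyone who wants the Animator fix from the earlier reply in code form, making sure the avatar has an Animator with some controller assigned is all it takes (a minimal sketch; the idle controller asset is a placeholder you create yourself):

          using UnityEngine;

          public class EnsureAnimator : MonoBehaviour
          {
              // Any controller with an idle state will do; assign your own asset here.
              [SerializeField] private RuntimeAnimatorController idleController;

              private void Awake()
              {
                  var animator = GetComponent<Animator>();
                  if (animator == null) animator = gameObject.AddComponent<Animator>();
                  if (animator.runtimeAnimatorController == null)
                      animator.runtimeAnimatorController = idleController;
              }
          }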

  • @kritimukherjee2640
    @kritimukherjee2640 1 month ago

    It's showing 'page not found' for the RPM Unity SDK.

    • @sgt3v
      @sgt3v  1 month ago

      Updated.

  • @rishaansahoo
    @rishaansahoo 2 years ago

    Hey Sercan! I liked the video but when I tried, it said there were two Newtonsoft.json.dll files. What does that mean? And is there a way to fix it?

    • @sgt3v
      @sgt3v  2 years ago

      Hi! Most probably you have another Unity package that brings Newtonsoft into the project. You can safely delete the Newtonsoft folder that comes with RPM.

    • @rishaansahoo
      @rishaansahoo 2 years ago

      I tried doing that, and it says it's unable to delete it since it's a read-only file.

    • @sgt3v
      @sgt3v  2 years ago

      Could you try quitting Unity and then deleting it?

    • @rishaansahoo
      @rishaansahoo 2 years ago

      @@sgt3v It worked, thanks Sercan!

  • @RivenbladeS
    @RivenbladeS 1 year ago

    Can I do head tracking like this in a mobile app??? Can I use it like that?

    • @sgt3v
      @sgt3v  1 year ago

      Yes, you can use the Unity ARKit package AFAIK: docs.unity3d.com/Packages/com.unity.xr.arkit@4.1/manual/index.html

  • @fookustudios3279
    @fookustudios3279 2 years ago

    Is there a way to use this with a camera on the machine (webcam) rather than using the camera on the phone?

    • @sgt3v
      @sgt3v  2 years ago

      Not with this Unity package; it requires an iPhone camera with a depth sensor for the ARKit support.

    • @fookustudios3279
      @fookustudios3279 2 years ago +1

      @@sgt3v Oh okay, thanks for getting back to me. Nice tutorial.

  • @grapescartoon5898
    @grapescartoon5898 3 years ago +2

    Please explain, can we use Android?

    • @sgt3v
      @sgt3v  3 years ago

      Sadly no, this Unity feature is iOS only.

  • @nullreferencegames
    @nullreferencegames 1 year ago

    Hi, does this work without the iPhone app or is it required?

    • @sgt3v
      @sgt3v  1 year ago

      Unfortunately, it is required.

    • @nullreferencegames
      @nullreferencegames 1 year ago

      @@sgt3v OK, thanks for your reply!

  • @sandikohler3861
    @sandikohler3861 2 years ago

    Is this supported or available for Windows and Android? Because I'm going to make this for my school project; sorry, I can't afford an Apple device 🙏

    • @sgt3v
      @sgt3v  2 years ago

      Unfortunately, this is iOS only :(

    • @sandikohler3861
      @sandikohler3861 2 years ago

      @@sgt3v So, do you have a recommendation for an alternative system like this for Windows and Android?

  • @Y_awwad
    @Y_awwad 2 years ago

    Does the Face Capture app work on Unity, and if not, is there an alternative for it?

    • @sgt3v
      @sgt3v  2 years ago

      I suppose you meant to ask if it works with Android; in that case, it does not.

    • @Y_awwad
      @Y_awwad 2 years ago +1

      @@sgt3v Yes, I meant Android. Thank you for your response and explanation.

  • @andrea59464
    @andrea59464 3 years ago

    My project doesn't open the window where I insert the link. How can I solve this?

    • @sgt3v
      @sgt3v  3 years ago

      Did you download and import the RPM Unity SDK?

  • @vedikagupta9131
    @vedikagupta9131 1 year ago

    Will it work with Windows and an iPhone?

    • @sgt3v
      @sgt3v  1 year ago

      Unfortunately no.

  • @betamotorrad
    @betamotorrad 1 year ago

    The mouth is not opening, any fix?????

    • @sgt3v
      @sgt3v  1 year ago

      Make sure you have an avatar with ARKit blendshapes on it and that you did the mapping correctly.

  • @hanns1962
    @hanns1962 2 years ago

    Does it work with any Genesis figure?

    • @sgt3v
      @sgt3v  2 years ago

      I do not know what a Genesis figure is, but as long as it's a 3D model with ARKit blendshapes it should work the same.

  • @ms_litto
    @ms_litto 2 years ago

    Does it also work when you do a build?

    • @sgt3v
      @sgt3v  2 years ago +1

      No, as far as I know this package is intended for recording face captures via the companion app. For real-time face capture with an iPhone, you can use Unity's ARKit XR Plugin.

    • @ms_litto
      @ms_litto 2 years ago

      @@sgt3v thx good to know

  • @hosseintajeddini8643
    @hosseintajeddini8643 2 years ago

    Can I use this with another character?

    • @sgt3v
      @sgt3v  2 years ago +1

      If the character has ARKit blendshapes then sure.

  •  2 years ago

    Hi Sercan, this method is working. When I try with my 3D character that I downloaded 2-3 days ago it works, but when I try with new avatars, the face mapping is not working; just the head moves and the face smiles. How can I fix it? Could it be a problem with readyplayerme, maybe they introduced a new system recently? Is it still working, can you try?

    • @sgt3v
      @sgt3v  2 years ago

      Could you make sure you are making an avatar with ARKit blendshapes?
      You will need to assign an AvatarConfig in the Settings window that has ARKit blendshapes enabled.