Google Mediapipe Face Tracking in React with Ready Player Me Avatars!

  • Published 20 Aug 2024

COMMENTS • 50

  • @jaumaras
    @jaumaras 1 year ago

    That's great. It's just what was missing in the integration you've set up. I'm already waiting for your next videos; I've seen them all. Thanks a lot!

  • @tarunsukhu2614
    @tarunsukhu2614 1 year ago

    I had earlier done this with a Python backend for blendshape predictions, but this is the best way to do it. Thank you for the detailed explanation and the code. [A browser-side sketch follows after this thread.]

    • @Cyb3r-Kun
      @Cyb3r-Kun 3 months ago

      Hey, I'm looking for exactly this: how to do blendshape predictions in Python. Could you share how you did it?
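
For readers asking about the blendshape predictions above: a minimal browser-side sketch using the @mediapipe/tasks-vision FaceLandmarker with outputFaceBlendshapes enabled. The WASM and model URLs are illustrative placeholders, and the Python package exposes equivalent options.

    // Minimal sketch (TypeScript): per-frame face blendshape prediction in the browser.
    // Assumes a <video> element that is already streaming the webcam.
    import { FaceLandmarker, FilesetResolver } from "@mediapipe/tasks-vision";

    async function createLandmarker(): Promise<FaceLandmarker> {
      // Illustrative asset locations; pin your own versions in a real app.
      const vision = await FilesetResolver.forVisionTasks(
        "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm"
      );
      return FaceLandmarker.createFromOptions(vision, {
        baseOptions: {
          modelAssetPath:
            "https://storage.googleapis.com/mediapipe-models/face_landmarker/face_landmarker/float16/1/face_landmarker.task",
        },
        outputFaceBlendshapes: true, // ARKit-style categories (eyeBlinkLeft, jawOpen, ...)
        runningMode: "VIDEO",
      });
    }

    function predictLoop(landmarker: FaceLandmarker, video: HTMLVideoElement) {
      const result = landmarker.detectForVideo(video, performance.now());
      // Each category carries a categoryName and a 0..1 score.
      for (const shape of result.faceBlendshapes?.[0]?.categories ?? []) {
        console.log(shape.categoryName, shape.score);
      }
      requestAnimationFrame(() => predictLoop(landmarker, video));
    }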

  • @iramshaiq2691
    @iramshaiq2691 1 year ago

    Your videos are really helpful. Hope you continue making such useful videos😊

  • @EzekielMaina-ws2ld
    @EzekielMaina-ws2ld 3 months ago

    Hi Sarge, great stuff.
    Any leads/links on how to do this for the full body? A cheat sheet for mapping the poses to the avatar parts would also help.
    Thanks

  • @hiranw1
    @hiranw1 11 months ago

    This is really cool. Thank you.

  • @chivara91
    @chivara91 2 months ago

    An example with holistic tracking, please?

  • @Cyb3r-Kun
    @Cyb3r-Kun 3 months ago

    Hey, great video, although I'd really like a video on how to implement this in Python with blendshape predictions. Could you make a video on that?

  • @lemon3335
    @lemon3335 5 months ago

    Can this be implemented in UE4 C++ and then used to drive the model in UE4, like AR?

  • @muhammadaasharibnawshad9546
    @muhammadaasharibnawshad9546 3 months ago

    Hi @Sarge, really cool project. If I want to do full-body tracking and then make the avatar do the same, is it possible? What can I do to implement this using your current approach?

  • @xscore4811
    @xscore4811 4 months ago

    Is it possible to make the avatar mimic hand signs? Or is it only for the face?

  • @fofoasiri404
    @fofoasiri404 10 months ago

    💔💔 I don't understand, I need help 😢

  • @bobhawkey3783
    @bobhawkey3783 1 year ago

    Soooooooooo... FaceRig, but open source! Cool.

  • @m.ammarraza3957
    @m.ammarraza3957 11 months ago

    Thank you very much! Can we use a similar approach for the whole upper body?

  • @theboringscientist7226
    @theboringscientist7226 1 year ago

    How can I do this using the MediaPipe points?

  • @EricTsai-rj9vm
    @EricTsai-rj9vm 1 year ago

    Thanks a lot

  • @thegrey448
    @thegrey448 1 year ago

    Hi, thanks, I found it. ❤

  • @yujin1569
    @yujin1569 2 months ago

    I want to replace my face with a cat face, or with a model I draw in e.g. SolidWorks. Can you give me some advice?

    • @sgt3v
      @sgt3v  2 months ago

      As long as the model has the same bone structure and orientations, it should work. You can also get a model from Ready Player Me and modify it.

  • @nicolasportu
    @nicolasportu 1 year ago

    Outstanding! Hey, I am really struggling with Unity WebGL, trying to establish good communication between the browser and the Ready Player Me blendshapes... I can't send multiple parameters with the Unity instance's SendMessage("myGameObject", "myMethod", shape.categoryName, shape.score). I've tried with JsonUtility, but I am not an expert. Any idea? Thanks! [One possible workaround is sketched below.]
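
A possible workaround for the SendMessage limitation described above, sketched under the assumption that the page holds a Unity WebGL unityInstance: SendMessage accepts only a single string or number, so the blendshapes can be packed into one JSON string on the web side and parsed inside Unity (e.g. with JsonUtility and a [Serializable] wrapper class). The object and method names below are hypothetical.

    // Sketch (TypeScript): send all blendshape scores to Unity WebGL in one call.
    // "AvatarDriver" and "OnBlendshapes" are hypothetical names on the Unity side.
    type Blendshape = { categoryName: string; score: number };

    function sendBlendshapesToUnity(unityInstance: any, shapes: Blendshape[]) {
      // SendMessage(objectName, methodName, value) takes a single value,
      // so serialize everything into one JSON payload.
      const payload = JSON.stringify({ shapes });
      unityInstance.SendMessage("AvatarDriver", "OnBlendshapes", payload);
    }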

  • @andy111007
    @andy111007 10 months ago

    Thanks Serge, is it possible to integrate with Unity?

  • @gourangacharan6957
    @gourangacharan6957 11 months ago

    Hi Sarge, thank you for this amazing tutorial. Given that Mediapipe includes pose landmark detection capabilities, could we use a similar approach for hand transformations? Any thoughts on this?

    • @m.ammarraza3957
      @m.ammarraza3957 11 months ago

      Sir, have you done some work on it?

  • @gabrieljreed
    @gabrieljreed 1 year ago

    Awesome video! Thanks so much for posting. How could you mirror the webcam stream so it's a mirror image of what the user is doing?

    • @gabrieljreed
      @gabrieljreed 1 year ago +1

      To anyone else looking for a solution: I did it by setting the canvas styling to transform: "scaleX(-1);"

    • @sgt3v
      @sgt3v  1 year ago

      @gabrieljreed Yep, CSS should do the trick. [A sketch follows below this thread.]
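
A minimal sketch of the mirroring trick from this thread, assuming a React component that owns the canvas; the same transform: "scaleX(-1)" style works on the <video> preview as well.

    import React, { useRef } from "react";

    // Sketch (TypeScript/React): flip the canvas horizontally so the avatar
    // moves like a mirror image of the user.
    export function MirroredCanvas() {
      const canvasRef = useRef<HTMLCanvasElement>(null);
      return <canvas ref={canvasRef} style={{ transform: "scaleX(-1)" }} />;
    }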

  • @narrowspace8552
    @narrowspace8552 1 year ago

    Hi Sarge, thank you for the tutorial. I was wondering if this can be achieved in Unity?

    • @sgt3v
      @sgt3v  1 year ago

      I have a video on how this can be done with an iPhone; however, MediaPipe is not yet available for Unity.

  • @iramshaiq2691
    @iramshaiq2691 1 year ago

    Can we use ARKit blendshapes only for live facial tracking, or can they be used for something else too?

    • @sgt3v
      @sgt3v  1 year ago

      Anything face animation related.

    • @iramshaiq2691
      @iramshaiq2691 1 year ago

      @sgt3v Thanks for the reply. Actually, I am trying to change the facial expressions of my RPM character, but no matter what I do, I am not able to change its expressions.
      Would you give any suggestions in this regard? I would be really grateful.

    • @iramshaiq2691
      @iramshaiq2691 1 year ago

      Whenever I select ARKit blendshapes, nothing happens. Do I have to write a script for it?

    • @sgt3v
      @sgt3v  1 year ago

      You need to request the avatar with the ARKit param; you can read about all the options in the documentation: docs.readyplayer.me/ready-player-me [A sketch follows below.]
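
A sketch of what the reply above refers to, assuming a react-three-fiber scene and @react-three/drei: Ready Player Me lets you request the .glb with ARKit morph targets via a query parameter (the avatar ID is a placeholder; check docs.readyplayer.me for the current list of parameters).

    import React from "react";
    import { useGLTF } from "@react-three/drei";

    // Sketch (TypeScript/React): load an RPM avatar with ARKit morph targets included.
    // "AVATAR_ID" is a placeholder for your own avatar's ID.
    const AVATAR_URL = "https://models.readyplayer.me/AVATAR_ID.glb?morphTargets=ARKit";

    export function Avatar() {
      const { scene } = useGLTF(AVATAR_URL);
      return <primitive object={scene} />;
    }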

  • @johannaalejandracastillote1018

    Is it necessary to create an app? Why can't you use the mesh tracking to make a video? 😢 Well, I guess at least we have ChatGPT. Either way, good info.

    • @sgt3v
      @sgt3v  1 year ago +1

      I do not think I understand your question. MediaPipe is available for Python, Web, and Android, so I made a web app where the result is accessible to most viewers. You can check the MediaPipe documentation for more.

  • @yujin1569
    @yujin1569 2 months ago

    Thank you, your video is great. One problem: it works well with your avatar, but not with mine. I created my avatar with Ready Player Me today, one year after your video. Is it because the avatar standard changed? I print 'mesh.morphTargetDictionary[element.categoryName]' to the console, and half of them are 'undefined'.

    • @yujin1569
      @yujin1569 2 months ago

      Let me try again. Maybe I created a full-body avatar; this might be the reason. [A sketch addressing this follows below.]
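
On the 'undefined' entries discussed in this thread: one common cause is requesting the avatar without the ARKit morph targets (see the query-parameter sketch above); another is that the blendshapes may be spread across several meshes of the avatar (head, teeth, eyes), so a single mesh's morphTargetDictionary will not contain every category. A hedged sketch that walks the whole scene and only writes influences that actually exist:

    import * as THREE from "three";

    // Sketch (TypeScript/three.js): apply MediaPipe blendshape scores to every mesh
    // that defines a matching morph target, skipping categories a mesh does not have.
    type Blendshape = { categoryName: string; score: number };

    function applyBlendshapes(root: THREE.Object3D, shapes: Blendshape[]) {
      root.traverse((obj) => {
        const mesh = obj as THREE.Mesh;
        if (!mesh.isMesh || !mesh.morphTargetDictionary || !mesh.morphTargetInfluences) return;
        for (const shape of shapes) {
          const index = mesh.morphTargetDictionary[shape.categoryName];
          if (index !== undefined) {
            mesh.morphTargetInfluences[index] = shape.score;
          }
        }
      });
    }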

  • @hamnakhawar865
    @hamnakhawar865 1 year ago

    ARKit is accessible to iOS users; are there any alternatives available for Windows?

    • @sgt3v
      @sgt3v  1 year ago

      This is about ARKit blendshapes, not ARKit. ARKit blendshapes come with RPM avatars.

    • @hamnakhawar865
      @hamnakhawar865 1 year ago

      @sgt3v But the ARKit blendshapes are not working when selected in the avatar config in Unity.

  • @__________________________6910

    Can we do the same thing with Python?

    • @sgt3v
      @sgt3v  1 year ago

      Yep, MediaPipe has a Python lib as well.

    • @__________________________6910
      @__________________________6910 1 year ago

      @sgt3v Yes, but can I do the character animation?

    • @sgt3v
      @sgt3v  1 year ago

      If you can use a 3D renderer in Python, then yes, it should work. In the case of the web, it's based on react-three-fiber.

    • @__________________________6910
      @__________________________6910 1 year ago

      @sgt3v 🙂

    • @sgt3v
      @sgt3v  1 year ago

      @__________________________6910
      This project might help you: github.com/mmatl/pyrender

  • @xlgablx
    @xlgablx 1 year ago

    Can this be used to combine any chatbot with that avatar?

    • @sgt3v
      @sgt3v  1 year ago

      Yes.

    • @xlgablx
      @xlgablx 1 year ago

      @sgt3v Do you have any tutorial on combining, for example, a wit bot or ChatGPT + lip movement?