Free face tracking update: New UI, Eyebrow tracking, new Blender and Unreal Files...and more!

  • Published 20 Oct 2024
  • Donation link: www.buymeacoff...
    Executable and files link
    github.com/rws...

COMMENTS • 102

  • @lubruz7164
    @lubruz7164 4 years ago +28

    I hope you get a Megagrant, its a great project. Congratulations for your work!

  • @tudortarta1903
    @tudortarta1903 3 years ago +5

    I was just watching random ass blender videos and this got recommended . . . goddamn incredible

  • @russellmurphy3430
    @russellmurphy3430 4 years ago +4

    😊 Thank you Rob. It's quite exciting. I still wonder how you are able to do all that magic in your spare time. Inspired. Cheers. 👍🏾👍🏾👍🏾

  • @Vidyut_Gore
    @Vidyut_Gore 1 year ago

    Liked and subscribed! Also I want to say I appreciate you working for a grant rather than monetizing on Patreon and subscriptions and such. I find many subscription channels end up creating content to feed the people who are paying and turn into more "YouTube channel" media companies rather than remain focused on gamechanging projects. Of course, they are good for the wealth of educational info, but I think niche stuff like this takes knowledge forward.

  • @borisrin3775
    @borisrin3775 3 years ago +2

    Man from all my Heart - Thank You 🙏 You are an amazing person! Wish you all the best 💗

  • @amigoface
    @amigoface 4 years ago +2

    definitely deserves a dev grant.
    this is awesome

  • @davidcripps3011
    @davidcripps3011 4 years ago +4

    This is outstanding work! I'm enjoying watching it develop. Good luck with the grant :-)

  • @kendarr
    @kendarr 4 years ago

    Niceee I've been checking this for the last few months, glad to see you're still having fun with it and getting this out to the community

  • @emo10001
    @emo10001 4 years ago

    Rob this is great! I'll be trying this tonight and/or tomorrow night! Also...buying you a cup of coffee!

    • @Squarecoin
      @Squarecoin  4 years ago

      Thanks so much! That's amazing. As it happens I was just animating something and have come across a behaviour of the 'lip funnel' shape under certain conditions that kinda wrecked the capture so just a heads up if you are using it today. I'll try and get a fix in for tomorrow. :)

  • @DavidLDaltonProductions
    @DavidLDaltonProductions 4 years ago +2

    WOW! This is extremely generous of you, Rob. I wish you luck in getting the grant. Your work certainly deserves it. I do most of my work in DAZ Studio, but I'm fairly certain it's possible to export animations from Blender to DAZ. I'll be giving it a try in the very near future. Thanks again for the fantastic work!

    • @Squarecoin
      @Squarecoin  4 years ago

      Thanks! Version 0.8 is out now and I'll be putting together a walkthrough of the new features tomorrow hopefully. Going to be filling out the grant application after that!

  • @The9thCell
    @The9thCell 3 years ago +1

    I am honestly impressed with your work and I must congratulate you for it. I'm very curious to see how it evolves! Awesome and exciting stuff, for sure! Keep up the great work!

  • @infotoons212
    @infotoons212 4 years ago +2

    Pressure applied :) Thanks for doing this.

  • @MarkOfArgyll
    @MarkOfArgyll 4 years ago

    Great progress! Fantastic work as usual, it's really great to see everything coming together nicely.

  • @wdblackout
    @wdblackout 3 years ago

    Thank you Rob, this tool worked on my custom character.

  • @MonkeyManMods
    @MonkeyManMods 4 years ago

    Amazing! Really going to follow along, as it would help tremendously with future projects

  • @scottlee38
    @scottlee38 4 years ago

    I appreciate your hard work on this! I might try to use this alongside the Blender "Faceit" addon.

    • @Squarecoin
      @Squarecoin  4 years ago

      Yeah! I need to give that a look. Real time saver!

    • @scottlee38
      @scottlee38 4 years ago

      @@Squarecoin Absolutely. I bought it. The weight painting demonstration in the video is pretty legit. I have some complex faces from Daz3D characters and that's gonna save me loads of time.
      Funnily enough it released after I finished my own technical workflow documentation. Now I've gotta make a change because of that addon!

    • @emo10001
      @emo10001 4 years ago +1

      I also have faceit. That's what I'm going to try with some Make Human models

    • @emo10001
      @emo10001 4 years ago

      I'd be interested in knowing how it works for you, Scott! email4emo at gmail dot com

  • @Deech
    @Deech 4 years ago

    This is so cool Rob! Can't wait to see more!

  • @carlosedubarreto
    @carlosedubarreto 3 years ago

    I loved this solution. For me it's better than the others I've tested because we can tweak it before sending to Blender.
    Today I spent more than 4 hours testing it. I had some crashes at the beginning, but now it is working great.
    I plan to make a decent animation to share on Twitter to show more people your solution.
    Do you use Twitter? I would like to tag you there when I finish the decent animation (hopefully tomorrow).
    And great work you are doing!!!

  • @AiPhile
    @AiPhile 3 years ago +1

    I have also planned work on that on my channel. I will use hand ✋ tracking and pose estimation, using a module called MediaPipe.

  • @marcmordelet
    @marcmordelet 1 year ago

    great work!
    I haven't tested it yet but your tracker seems to be very good.
    Export tied to one predefined type of rig seems to me a bad idea for professional use.
    There is an option which seems to be missing from your program that would make it useful in my work:
    being able to export tracking point animations as FBX.
    It would allow the use of your tracker with our own rigs.

  • @randomdudr
    @randomdudr 3 years ago

    Subscribed and liked... pressure applied!!!

  • @LP12576
    @LP12576 4 years ago +1

    Yey! Can't wait to try it

  • @frelsi6386
    @frelsi6386 3 years ago +1

    Hi, I have a little problem. When I click on export, the program crashes. :( I tried cutting the 3-minute video into smaller sequences, but the software crashed when I wanted to export my face tracking.

  • @PeppePascale_
    @PeppePascale_ 4 years ago +3

    Is this suitable for realtime animation through the Unreal bridge? I really don't wanna buy an iPhone X

  • @3dchannel488
    @3dchannel488 3 years ago

    awesome..
    Do we need to set up blendshapes or morph targets on our character to use it? I mean, does it require pre-set blendshapes, or is it based on realtime deformation and skinning?
    And I am curious whether we can record the animation to use in Maya.

  • @benyaminkhodabandeh
    @benyaminkhodabandeh 2 years ago

    Hi Rob, I can't set key poses and the panel is disabled in v0.8! What am I doing wrong?

  • @MarvinXOnline
    @MarvinXOnline 4 years ago

    Love you bro. Keep this up!

    • @Squarecoin
      @Squarecoin  4 years ago +1

      Thanks! Will do! There's a new update out on the github page now with a lot of new features. Tutorial tomorrow hopefully.

  • @jatingarg3770
    @jatingarg3770 3 years ago

    It is really really wonderful. But I am not able to re-create the animation in Unreal. Can you share the level blueprint of the project? 'Strongtrack08UnrealExample' doesn't have the same blueprint as in this video.
    Thanks

  • @Blueoyster440
    @Blueoyster440 2 years ago

    In the Blender example file the tracked text file address has forward slashes, but the address from my Windows has backslashes... does this have anything to do with why I can't import a track?
    What I mean is, in the example file the text file reads:
    C:/Users/Robert/Desktop/anim_export.txt
    but my Windows address reads:
    C:\Users\Bio\Desktop\Bio_02A.txt
    I have "\" instead of "/"
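
A side note on the slash question above: Windows accepts forward slashes in most file APIs, so a backslash path can be normalised in Python before pasting it into the Blender field. A minimal sketch using the commenter's example path (this helper is not part of StrongTrack itself):

```python
from pathlib import PureWindowsPath

# The commenter's Windows-style path with backslashes...
win_path = r"C:\Users\Bio\Desktop\Bio_02A.txt"

# ...normalised to forward slashes, the form the example file uses.
# Python's open() and Blender's path fields accept this form on Windows too.
portable = PureWindowsPath(win_path).as_posix()
print(portable)  # C:/Users/Bio/Desktop/Bio_02A.txt
```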

  • @saurabhdewangan4974
    @saurabhdewangan4974 3 years ago +3

    When I'm trying to export the txt file, the window just closes. Anyone having the same issue? Please let me know if there's any solution.

    • @richlasma
      @richlasma 3 years ago +2

      me too same problem

    • @llYuki0okami
      @llYuki0okami 1 year ago

      To export without error, you need to make ALL extraction poses, not just a few as the program suggests...

  • @kleber1983
    @kleber1983 3 years ago

    Hello, and great job you have here. I was wondering how many, or which, shape keys my model should have to be compatible with StrongTrack? Thank you!

  • @NixCM
    @NixCM 3 years ago

    Does this export blendshape info like most of the ARKit apps? I'm looking hard for a replacement for Faceshift and ARKit isn't up to snuff!

  • @kamichikora6035
    @kamichikora6035 3 years ago

    This thing... I'll be back when I try it out

  • @philippecoenen
    @philippecoenen 3 years ago

    Hi, wow, great job!... I am trying to train a model (tomorrow I will do some animation if I succeed). But for now the iris point disappears... Is it a problem you know of?

  • @georges8408
    @georges8408 3 years ago

    Very nice work, thank you! But how can we apply that facial animation to a specific character model in Blender? Not sure how to do it.

  • @nirnny
    @nirnny 3 years ago

    Amazing stuff! Can it work with pre-recorded video as well?

  • @Kurabiye.Canavari
    @Kurabiye.Canavari 1 year ago

    Is it possible to transfer this to other 3D software like Blender?

  • @philippecoenen
    @philippecoenen 3 years ago

    I hope you will get the grant!

  • @llYuki0okami
    @llYuki0okami 1 year ago

    When I click export, set a name for the txt file and click save, the program crashes with error:
    IndexError: index 1 is out of bounds for axis 0 with size 1
    EDIT: to export without error, you need to make ALL extraction poses, not just a few as the program suggests...

  • @Madlion
    @Madlion 4 years ago

    Great work. May I ask which method you used for decomposing the coefficients? I.e. given a facial pose, how does your code decompose it into the base poses like smile, open jaw, etc.? I'm very interested in the technicality :)

    • @Squarecoin
      @Squarecoin  4 years ago

      It's actually the easiest bit of the whole thing in a way; just using SparseCoder from scikit-learn, which takes care of all the maths. If you look in decomp_functions.py you'll see my very sloppy implementation in there.
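
To make the reply above concrete: a minimal, hedged sketch of the general sparse-coding technique with scikit-learn's SparseCoder (an illustration only, not the project's actual decomp_functions.py code; the pose count, dimensions, and values are invented):

```python
import numpy as np
from sklearn.decomposition import SparseCoder

# Hypothetical dictionary: 3 key poses (e.g. smile, jaw open, brow raise),
# each flattened to a vector of 2D landmark offsets from the neutral pose.
rng = np.random.default_rng(0)
key_poses = rng.normal(size=(3, 10))
# Unit-normalise the rows so the recovered coefficients are comparable.
dictionary = key_poses / np.linalg.norm(key_poses, axis=1, keepdims=True)

# A captured frame composed of 0.5 * smile + 0.7 * jaw open.
frame = 0.5 * dictionary[0] + 0.7 * dictionary[1]

# The coder solves for the non-negative mix of key poses explaining the frame.
coder = SparseCoder(dictionary=dictionary, transform_algorithm="lasso_lars",
                    transform_alpha=1e-6, positive_code=True)
coeffs = coder.transform(frame.reshape(1, -1))[0]
# coeffs comes back approximately [0.5, 0.7, 0.0]
```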

    • @Madlion
      @Madlion 4 years ago

      @@Squarecoin Awesome thanks! I'll take a deeper look!

  • @ambroosvanhaverbeke3687
    @ambroosvanhaverbeke3687 3 years ago

    Awesome!

  • @theparkp
    @theparkp 1 year ago

    you are great

  • @bettymac8248
    @bettymac8248 3 years ago

    So this is for Windows only, no Apple?

  • @shaper_i-o
    @shaper_i-o 3 years ago

    Any chance for this to be compatible with Unity?

  • @justin42230
    @justin42230 4 years ago +1

    I'm having trouble importing it to Blender

  • @AbdoEltohamy
    @AbdoEltohamy 4 years ago

    You are the best❤

  • @emmanuelmatewere7575
    @emmanuelmatewere7575 2 years ago

    Is it possible to use your own model?

  • @SamyakAgarkar
    @SamyakAgarkar 4 years ago

    Sir, I need help.
    I can get approximate facial landmark location data in realtime. Now I want to transfer it to any rig in Blender. How should I move forward?
    Should I use shape keys or rig bones?
    And how do I transfer the landmarks to any rig?
    I need this for my college project!
    Thanks in advance

    • @Squarecoin
      @Squarecoin  4 years ago

      Shape keys are the ultimate point you want to end up on. You can 1) use a rig to in turn generate shape keys (you can see me do this in my second-most-recent video), 2) resculpt the example mesh to resemble your target mesh, or 3) use a solution like FaceIt for Blender.

  • @mrsomeone828
    @mrsomeone828 4 years ago

    Rob. I just don't understand one thing. I see StrongTrack capturing points of movement in a video, and then you just export a file, that's awesome - but it is not shown in the tutorial how the 3D model was set up to connect with StrongTrack. Could you please make a video explaining this process so we can make it work for our custom models? It is not clear to me how you prepared the model to receive this information.
    Thank you for giving this for free. Good work!

    • @Squarecoin
      @Squarecoin  4 years ago +1

      The file just contains 51 values that range from 0-1 and align with 51 different shapes that a mesh can have (52 with neutral/no expression). If you look at the example blender mesh (or in unreal) you should be able to see the different shape keys/morph targets. So the tricky part is making a mesh with those 51 shapes. You can take the mesh I've provided and sculpt that to fit your own mesh (which I show in an earlier video with Captain Rex from Star Wars), create your own from scratch with a rig as seen in my previous video (pt 2 and 3 of face modelling) or you could also use something like FaceIt for Blender which is a product that came out recently that makes the same standard 51 shapes. Does that make sense?

    • @mrsomeone828
      @mrsomeone828 4 years ago

      @@Squarecoin Kinda. It makes sense to me when I think that the point movement exported from StrongTrack then recognizes the bones that are in the model, which I see is important to have, since you mentioned we have to build our models with the same rig as you did (plus the shape keys). So, do the points locate the appropriate bones automatically? No need for matching their names or anything? Or does it just identify them by the position of the bones relative to the points?
      Sorry if my questions are stupid! I am a modeler and not an animator :D. But I am very interested in this because I plan to work on my own game in Unreal.

    • @Squarecoin
      @Squarecoin  4 years ago +1

      The bones don't come into play with this implementation; it's all shapekey based. You can think of the shapekeys as basically just 51 different statues in a museum showing the same person pulling different expressions that we've condensed into one instance that can blend between them all. That's all it is really. Bones are an alternative way to animate a face, if you want to 100% accurately show a jaw rotating for example - but for what we're doing here, not necessary. Bones can also come into play as a way to generate the different statues but only as an intermediate step.

    • @mrsomeone828
      @mrsomeone828 4 years ago

      @@Squarecoin Yes, I know what shapekeys and bones are. At first I just didn't understand what the requirements were for the model to have such a seamless match with this file you generated from StrongTrack. Now it all makes more sense. Thank you for the clarification.
      However, in certain situations people might be interested in the output animation being driven by bones, in games for example. I read somewhere about some game studios that, after the facial motion capture is done, run a program to convert the vertex animation to bone-skinned animation. If I remember well, they did that with "Hellblade: Senua's Sacrifice". Studios have such programs in-house; interestingly, I didn't find any free or even paid application that would do this. There was one though, made by Hans Godard, that he used at Naughty Dog, but it was kinda expensive and only for Maya. Article here - lesterbanks.com/2015/04/skinning-converter-for-maya/
      Just saying in case you might be interested.
      But hey! What you already did is already fantastic, and for free!! Thank you again for this!

    • @Squarecoin
      @Squarecoin  4 years ago +2

      Very interesting papers! I'll have to look into the subject more, though I've always personally been a bit unsure about the advantage of using bones as opposed to morph targets in facial rigging (outside of the eyes and perhaps the jaw of course). Is it considered more performant?

  • @Utubewonderful
    @Utubewonderful 4 years ago

    Hi Rob, I'm an intermediate Python coder with a passion for VFX, and I'm starting to invest time in learning Blender. Can I join you in developing this project? I was planning to create the same type of project, so after seeing yours I'm so excited. Please let me know, thanks.

    • @Squarecoin
      @Squarecoin  4 years ago

      Well there's always the github repo if you want to look through that and see what you make of it. That's the beauty of open source :)

  • @EternalKernel
    @EternalKernel 3 years ago

    Nice :D

  • @jonos138
    @jonos138 4 years ago

    Great work. Can you make it work for LightWave 3D??

  • @sanjaylama4882
    @sanjaylama4882 4 years ago

    I don't know how many have tried it, but when I tried it was very difficult to follow the tutorial. First I was not able to install from the exe file and there was no log to see what the exception was, so I gave up trying to run from the exe. However, I was able to run it from Anaconda. The major issue I ran into was that I didn't know where exactly to put the 51 key points. If you could provide the accurate positions of all 51 keypoints in an image with descriptions (e.g. point 23 should be in the middle of the nose, and so on) that would be really helpful, because when I was tracking the positions on the inner lips I somehow had 4 points on the upper lip and 2 points on the lower lip. Moving track points is also not very intuitive; it took me a while to figure out that a right mouse click would move the whole group.
    It was really troublesome to record my own video and put in extreme poses. First of all, how many extreme poses are required, and what counts as an extreme pose? It would be really helpful if there was a sample video to try out, so that after getting comfortable a user can record their own video.
    One thing to clear up is when to press F and T and when to press W and N. I don't think anyone without prior knowledge of AI will understand that they have to train models after the landmarks are set.
    Also, will it improve significantly if we have 51 (or at least eyebrow/nose/mouth) trackers on our face while recording?
    After a long painful process the tracking was really good, but then I was not able to save the file. The exception is as below:
    File "strongtrack.py", line 854, in export
    mouth_coeffs, brow_coeffs, _, _ = decomp.findCoeffsAll(points,self.keyposes, self.keydrops)
    File "strongtrack-0.7\0.7\decomp_functions.py", line 224, in findCoeffsAll
    shiftedPosesMouth = shiftKeyPoses(width_points, mouth_centre, keyposes_mouth, 'mouth')
    File "strongtrack-0.7\0.7\decomp_functions.py", line 258, in shiftKeyPoses
    width_keypose = (keyposes[0][16][0]-keyposes[0][0][0])
    IndexError: index 0 is out of bounds for axis 0 with size 0
    But it seems a really promising application you are building, sir. Thank you very much.

    • @sanjaylama4882
      @sanjaylama4882 4 years ago

      NVM... I didn't set keyposes... It really fits great on the provided sample Blender project... CHEERS!!!! But there are flickers in the mouth movements.

    • @Squarecoin
      @Squarecoin  4 years ago +1

      Thanks for your feedback! Being a pre-release of something put together by an amateur it is very buggy....as I hope I'm always trying to make clear. It's possible the flickering mouth movements are being caused by something that I'm working on fixing as we speak. And yep, there's a lot of quality of life improvements to be made with UI such as - as you say - indicating best layout. As the weeks/months roll on things will be improving.

    • @Squarecoin
      @Squarecoin  4 years ago +1

      Also: sorry to hear it was a long and painful process. Overcoming the fiddly-ness of the landmark placements is one of the reasons I'm looking to assemble a copyright-zero dataset and, based on your comment, I'll probably pivot toward spending more time on that.

  • @wahilboulahbach2083
    @wahilboulahbach2083 2 years ago

    Does anyone know how to do this in Maya?

  • @worldcreatures2223
    @worldcreatures2223 2 years ago

    Can I use it on Windows 7 32-bit????

  • @Themagiciangr
    @Themagiciangr 4 years ago

    Can this be used for a grease pencil rig?

  • @mugyoung5726
    @mugyoung5726 3 years ago

    damn coollllllll!

  • @davidcripps3011
    @davidcripps3011 4 years ago +1

    Has anyone else had an error loading dlib at line 17 (file strongtrack.py) when running the Windows 10 64-bit executable? dlib is installed in my Python directory.

    • @Squarecoin
      @Squarecoin  4 years ago +1

      Huh. That's weird. I've been testing it on friend's machines and virtual machines with just blank installations of windows 10 and no sign of issues. Suppose I might need to rethink how to package it up. Sorry about that!

    • @davidcripps3011
      @davidcripps3011 4 years ago

      @@Squarecoin Rob, you've got better things to get on with. It's probably my setup. I was just hoping some others following your work would have a suggestion. Don't do anything unless others have the same issue :-)

    • @davidcripps3011
      @davidcripps3011 4 years ago

      @@Squarecoin I've found that ElementTree didn't install properly, so I'll work on finding out why. Don't put any time into helping with this, Rob. I'm sure it will be something I haven't installed properly.

  • @eledikohabib3369
    @eledikohabib3369 4 years ago

    Hi Rob, this is great. I have wanted to implement this kind of face mocap in a Raspberry Pi motion capture suit that I'm working on. I tried a lot... and boom, only to come across your video.
    Please can you tell me the use of:
    1) base.npy
    2) base_face.npy
    in the /data folder of the source code? Thanks in advance

    • @Squarecoin
      @Squarecoin  4 years ago

      Hi. base_face is just the default position of the face dots if you haven't begun positioning them yet, so just an array of 2D co-ordinates. It could probably be rolled into the main scripts tbh but I haven't got around to doing it yet. base.npy is an array that's used when taking an arbitrary number of coefficients of different key poses and converting it to an array of the 50 or so morph targets. Super simple stuff but the words escape me at this time, but say you have an expression of 0.5 smile, 0.7 jaw open. The final data that's saved can't just be those two numbers because you're streaming to a model that has 50 different shapes. So with base.npy you end up with a full list of all 50 or so shapes but with smile and jaw being 0.5 and 0.7 respectively within that, while the other 48 are 0.0. It's a bit over-engineered for now but it will be useful when version 0.9 or 1.0 is released, where greater control over all 50 of the shapes can be tied to the coefficients more flexibly.
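
The expansion described in the reply above can be sketched in a few lines of NumPy. The shape names here are placeholders (the project's real ordering ships in its data files); the point is only that a sparse set of solved coefficients gets scattered into a full, mostly-zero array before export:

```python
import numpy as np

# Hypothetical shape-key names; StrongTrack's real ordering lives in base.npy.
ALL_SHAPES = ["smile", "jawOpen"] + ["shape_%02d" % i for i in range(49)]

def expand_coeffs(active):
    """Scatter a sparse {shape: coefficient} mapping into the full
    51-value array, zero everywhere except the solved shapes."""
    full = np.zeros(len(ALL_SHAPES))
    for name, value in active.items():
        full[ALL_SHAPES.index(name)] = value
    return full

frame = expand_coeffs({"smile": 0.5, "jawOpen": 0.7})
# frame[0] == 0.5, frame[1] == 0.7, the other 49 entries stay 0.0
```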

  • @NikhilKhairwal
    @NikhilKhairwal 3 years ago +1

    #MEGAGRANTFORROBINMOTION

  • @OzzyMariam
    @OzzyMariam 4 years ago

    Can I use this with any model/rig?

    • @Squarecoin
      @Squarecoin  4 years ago

      In theory yeah....it has to have the corresponding shape keys (or morph targets). The example files contain such a mesh which you could adapt to a sculpt or you could look into a tool such as FaceIt for Blender which helps quickly create the right array of keys. It's the same process as shown in the previous video I just put out where you rig a face and then bake keys.

  • @skilatgamedev3353
    @skilatgamedev3353 3 years ago

    Really nice! That's what I'm looking for! I'll try it soon with MetaHuman. I tried one before but gave up because there were 3 pieces of software to sync and then a retarget onto the character in Maya (why Maya?? So stupid). And why retarget with extra software? Your way is so much better.

    • @VIRUSMUSICc
      @VIRUSMUSICc 3 years ago

      Hey mate, did it work with MetaHumans?

  • @GameDevUE
    @GameDevUE 4 years ago

    Does Android work with your app? 😊

    • @Squarecoin
      @Squarecoin  4 years ago +1

      It would in the sense that it just uses regular video, so anything that can record video works. It doesn't work in real time; currently this program is about processing recorded video.

    • @GameDevUE
      @GameDevUE 4 years ago

      @@Squarecoin that's perfect! Recording works for me! Thanks for this amazing app Rob!

  • @legalwaffles612
    @legalwaffles612 3 years ago +1

    i poop