iPhone X Facial Capture test PART 4 - GAME CHARACTERS

  • Published 5 Nov 2024

COMMENTS • 72

  • @MultiPuPoo
    @MultiPuPoo 6 years ago +13

    This is CRAZY!!! Can't wait for more of this!

  • @freenomon2466
    @freenomon2466 4 years ago +2

    Inspired by you I've been making my own facial mocap solutions! Thank you dude! Will have it for different platforms. You are my hero!

  • @grahamulax
    @grahamulax 6 years ago +7

    WHAT! I've been theorizing this since the X came out that it would be so cool to do for game dev or motion work. I KNEW someone would get it, and of course it's you guys at Kite & Lightning! Do you ever think someone will release a guide on this kind of stuff, or will I always be dreaming of an easy export? :p

  • @SuenJason
    @SuenJason 6 years ago

    Wooo, better and more fun this time!

  • @buzzedlunne6668
    @buzzedlunne6668 6 years ago +4

    Mind-blowing... is this concept usable for creating short animation films too?

  • @ace5
    @ace5 5 years ago

    wow this is amazing! great work! will follow to see where this goes

  • @Bingeworthy
    @Bingeworthy 6 years ago +1

    Just watched WWDC and thought of you and these babies! Can't wait to see what you end up with and wonder if you and Apple have a similar vision.

    • @trickdiggidy
      @trickdiggidy 6 years ago +1

      Thanks! I'm excited to see what the face tracking improvements are and can't help but keep wanting to make a Bebylon app so peeps can build their game's beby character and use it as their memoji!

  • @redrubi9223
    @redrubi9223 3 years ago

    I love this guy

  • @avtpro
    @avtpro 6 years ago

    Awesome. It would be great if you could publish some of the work for others to follow, like the ZBrush/Apple targets. In any case, thanks for sharing all the great WIP info. Looks great.

    • @trickdiggidy
      @trickdiggidy 6 years ago

      Thanks man. You can download the Apple targets from our blog: blog.kiteandlightning.la/iphone-x-facial-capture-apple-blendshapes/

  • @Drahoslav_Lysak
    @Drahoslav_Lysak 6 years ago +2

    Hyper Amazing!

  • @daybrick2513
    @daybrick2513 6 years ago +1

    aweeeeeesome job man! can't wait for the next video. thanks!

  • @teddybotssofttoysandgames9700
    @teddybotssofttoysandgames9700 6 years ago +1

    Epic work!!

  • @animian
    @animian 6 years ago +2

    Man this is pretty cool! Gonna look out for the SIGGRAPH presentation :) Btw, can you briefly give a tip on how you generate the iPhone blendshapes for the alien guy or baby? What tool in Maya do you use? Like just skin painting, or wrap? Shrinkwrap? Cheers!

    • @trickdiggidy
      @trickdiggidy 6 years ago

      Thanks Stian! Yup, in Maya I used wrap and delta mush to generate the blendshapes (a rough sketch of that transfer is below). Before that stage I used shrink-wrap as well to help fit the neutral pose to the alien face.
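      A minimal sketch of how that wrap-and-duplicate transfer can be scripted with Maya's Python API (maya.cmds). It is one possible setup rather than the exact pipeline described above; the node names (appleNeutral_geo, appleBlendShapes, bebyNeutral_geo, bebyArkitBlendShapes) are hypothetical placeholders, and the CreateWrap call simply mirrors the Deform > Wrap menu item.

      import maya.cmds as cmds
      import maya.mel as mel

      SOURCE_HEAD  = "appleNeutral_geo"    # head mesh carrying the ARKit targets (placeholder name)
      SOURCE_BLEND = "appleBlendShapes"    # blendShape node holding those targets (placeholder name)
      CHAR_HEAD    = "bebyNeutral_geo"     # character mesh already fitted to the source neutral
      ARKIT_SHAPES = ["jawOpen", "mouthSmileLeft", "browInnerUp"]  # extend to the full ARKit set

      # Keep an untouched copy of the character neutral to receive the final blendShape node.
      char_final = cmds.duplicate(CHAR_HEAD, name=CHAR_HEAD + "_final")[0]

      # Wrap the character mesh to the source head (deformed object first, influence last),
      # then add a delta mush to smooth out wrap artifacts.
      cmds.select([CHAR_HEAD, SOURCE_HEAD], replace=True)
      mel.eval("CreateWrap;")
      cmds.deltaMush(CHAR_HEAD, smoothingIterations=10)

      # Dial each ARKit pose to 1.0 on the source and duplicate the wrapped character to bake it out.
      extracted = []
      for shape in ARKIT_SHAPES:
          cmds.setAttr(SOURCE_BLEND + "." + shape, 1.0)
          extracted.append(cmds.duplicate(CHAR_HEAD, name=CHAR_HEAD + "_" + shape)[0])
          cmds.setAttr(SOURCE_BLEND + "." + shape, 0.0)

      # Assemble the character's own ARKit blendShape node from the baked targets.
      cmds.blendShape(extracted + [char_final], name="bebyArkitBlendShapes")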

  • @chrisgreenwell3404
    @chrisgreenwell3404 6 years ago +1

    Sooo much awesomeness!!

  • @cool24a
    @cool24a 5 years ago

    You are a genius!! This is awesome! Love it!

  • @eleanorlee2833
    @eleanorlee2833 6 years ago

    Please release it on Kickstarter, etc. I will def buy it

  • @PierreDaGreatProductions
    @PierreDaGreatProductions 4 years ago

    Impressive

  • @blakeXYZ
    @blakeXYZ 6 years ago +4

    Great work, it's nice seeing your development and the milestones you're reaching for. Your passion and hard work really show. Although the baby narration is creeping me out haha. What game engine are you feeding the live mocap data to?

    • @trickdiggidy
      @trickdiggidy 6 years ago

      Lol, hopefully using real performers and bebyfied voices will reduce the creep factor! We're using UE4 for our game, though I haven't got live streaming going yet (using Maya as the middleman). I think UE4 4.19 will make live-streaming the data easier to implement, so stay tuned!

  • @nafkicreations9340
    @nafkicreations9340 6 years ago +1

    This is so Awesome

  • @davoodkharmanzar4881
    @davoodkharmanzar4881 2 years ago

    Hi...
    Amazing work ;]
    Where can I buy that helmet?
    What is that helmet model?
    Thanks.

    • @xanaduBlu
      @xanaduBlu 2 years ago

      Standard Deviation is the helmet I'm using now… it's super awesome

  • @KRGraphicsCG
    @KRGraphicsCG 6 years ago +1

    This is impressive. And very clean too. As for your helmet, do you have access to a 3D printer?

  • @Super-id7bq
    @Super-id7bq 6 years ago +1

    Dude, your mocap work is incredible, but I'm mostly impressed at how stunning this looks! :O Is this all in-engine in real time?

    • @trickdiggidy
      @trickdiggidy 6 years ago

      Thanks man! This video is rendered in V-Ray, but the last short one I posted was rendered in real time in Unreal, which is shockingly close to V-Ray! Sadly YouTube compresses these things so much that a lot of the sweet details get lost, but it's pretty amazing seeing such high-quality subsurface rendering in Unreal.

    • @Super-id7bq
      @Super-id7bq 6 years ago

      Wow dude! The Unreal one actually looks amazing! I can't wait to see more from you, dude, you're doing amazing work, especially with the iPhone for face cap. I popped into the animation studio for Star Citizen in the UK and got to check out what they were doing with Faceware. They had pretty much made tracking markers redundant because they were getting such amazing tracking right out of the box without any markers at all, but it's awesome to see that you are managing pretty much the same thing in real time on a damn phone haha! I also had the pleasure of being scanned into the game and had to go through all those blend poses myself. Probably the most unflattering thing I've ever done :D

    • @Super-id7bq
      @Super-id7bq 6 years ago

      Also, I may have missed this, but how are you tracking your hands in real time with so many bones for full articulation? When you were throwing metal signs I was impressed by how accurate it was.

  • @erictuck2505
    @erictuck2505 6 years ago +2

    Great work! What helmet are you using for this? I would love to try this out too.

    • @trickdiggidy
      @trickdiggidy 6 years ago

      It's a cheap paintball helmet, and not very comfortable, but it gets the job done. Here are the helmet details: uploadvr.com/iphone-xsens-performance-capture-bebylon/

  • @jungchoi5834
    @jungchoi5834 6 years ago +2

    Incredible. Is it rendered in some GPU renderer? (Redshift or Octane?)

  • @hetaljain5880
    @hetaljain5880 6 years ago +1

    super!!

  • @obieze3319
    @obieze3319 6 years ago +1

    Ha! "Davor"!

  • @roxaneviljoen4450
    @roxaneviljoen4450 2 years ago

    Epppic

  • @MM4more
    @MM4more 5 months ago

    Did you ever manage to build a facial rig to animate on top of the ARKit motion capture data? I'm in the same boat and I'm wondering if this is possible. Thanks

    • @xanaduBlu
      @xanaduBlu 5 months ago +1

      Not a rig per se; I've been doing it in a janky way when I need to do little fixes. I'm getting ready to migrate all my characters to full MetaHuman, which will have new challenges.

    • @MM4more
      @MM4more 5 months ago

      @@xanaduBlu Good luck lol. Love the work & the new video. Look forward to seeing more.

  • @MULAMIGZ
    @MULAMIGZ 3 years ago

    Been about the metaverse

  • @TommyGunsStudios
    @TommyGunsStudios 2 years ago

    How much does the suit cost?

  • @444haluk
    @444haluk 3 years ago

    What is that effect on the baby's face at 0:17?

  • @NOMADxBR
    @NOMADxBR 5 years ago

    This is amazing!
    Can you tell me the price of the Xsens suit?

  • @prahaladadharsh9766
    @prahaladadharsh9766 5 years ago

    Please mention the equipment you use to do this...

  • @transcendingthegenre
    @transcendingthegenre 6 years ago +1

    2:30 I'm gonna get medieval probably

  • @hideking2299
    @hideking2299 4 years ago

    wow!!!!

  • @sarp5567
    @sarp5567 6 years ago

    I have a full-body Perception Neuron system and it doesn't work reliably; it is such a bad system.

  • @thatskykidfreya
    @thatskykidfreya 6 years ago

    Is this type of thing possible with later iPhone models?

    • @trickdiggidy
      @trickdiggidy 6 years ago

      At the moment, the way I'm doing it works just on the iPhone X. Technically, though, you can still do facial capture on any mobile with a camera (same idea as Snapchat) using the camera feed versus the depth map created on the iPhone X.

  • @david.ricardo
    @david.ricardo 6 years ago +1

    you should make an app

  • @robertomorales8751
    @robertomorales8751 6 years ago +1

    Bro, duck selling the game, sell the app you're making 😂

  • @prahaladadharsh9766
    @prahaladadharsh9766 5 years ago

    Please list all of the things you used, including software and equipment... please (mention prices if possible)... please, anyone, reply

    • @trickdiggidy
      @trickdiggidy 5 years ago +1

      Xsens Link suit, iPhone X, Unreal Engine 4.21. Maya was used to make the beby model, and Maya & ZBrush for the blendshapes.
      Helmet: uploadvr.com/iphone-xsens-performance-capture-bebylon/

    • @emekaeffi
      @emekaeffi 5 years ago

      @@trickdiggidy Can I use the iPhone X + Cinema 4D + Xsens for my character mocap?

    • @trickdiggidy
      @trickdiggidy 5 years ago +1

      I don't know enough about C4D and how it handles blendshapes and importing animation data. I'm quite sure getting Xsens body capture data in should be simple using FBX. @@emekaeffi

    • @emekaeffi
      @emekaeffi 5 years ago

      @@trickdiggidy Please, I need your advice on an animation project I'm working on, can I send you an email?

    • @UbongaShorts
      @UbongaShorts 5 years ago

      What is the iPhone app called?

  • @UbongaShorts
    @UbongaShorts 5 years ago

    What is the iPhone app called?

    • @Fipacz
      @Fipacz 5 years ago

      This one is not available to the public. Take a look at the MocapX app; it streams the data directly from the iPhone to Maya in real time.

  • @danl6952
    @danl6952 6 years ago +1

    How do you use AirDrop to get the recorded data onto the Mac?

    • @trickdiggidy
      @trickdiggidy 6 years ago

      The recorded data is stored locally on the iPhone, and I just use AirDrop to copy it to my desktop (a rough sketch of applying such a recording in Maya is below). I could potentially stream the data right to the desktop via Wi-Fi but haven't had a chance to dive into that.
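      A minimal sketch of how a recorded take like this could be keyed onto a character with Maya's Python API (maya.cmds), assuming the capture was exported as a CSV with a "time" column plus one column per ARKit blendshape coefficient. The file path, frame rate, and blendShape node name (bebyArkitBlendShapes) are hypothetical placeholders, not the exact format used in the video.

      import csv
      import maya.cmds as cmds

      CAPTURE_CSV = "/path/to/capture_take01.csv"   # AirDropped recording, converted to CSV (placeholder)
      BLEND_NODE  = "bebyArkitBlendShapes"          # blendShape node whose targets match the ARKit names
      FPS = 60.0                                    # assumed capture rate

      with open(CAPTURE_CSV) as f:
          for row in csv.DictReader(f):
              frame = float(row["time"]) * FPS
              for shape, value in row.items():
                  if shape == "time":
                      continue
                  # Key each ARKit coefficient straight onto the matching blendShape weight.
                  cmds.setKeyframe(BLEND_NODE, attribute=shape, value=float(value), time=frame)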

    • @danl6952
      @danl6952 6 years ago

      Cory Strassburger your solution is much better... I'm using the Unity plugin to extract the data from the phone, which doesn't support "wireless"... the phone has to be physically connected to the Mac. But what I need is your solution. Is it possible to share it with me? :) Btw, I'm using Perception Neuron instead of Xsens because it tracks every detail of my finger movement.

  • @farrahvee
    @farrahvee 3 years ago

    Why did he stop making videos?

  • @waly3302
    @waly3302 3 years ago

    JAJAJAJAJAJA

  • @mf-h3659
    @mf-h3659 5 years ago

    Is this what misanthropy feels like?

  • @roberthansen4673
    @roberthansen4673 1 year ago

    Lol so is this brown face? Good job.