This is CRAZY!!! Can't wait for more of this!
Inspired by you, I've been making my own facial mocap solutions! Thank you dude! Will have it for different platforms. You are my hero!
WHAT! I've been theorizing about this since the X came out, thinking it would be so cool for game dev or motion work. I KNEW someone would get it, and of course it's you guys at Kite & Lightning! Do you think someone will ever release a guide on this kind of stuff, or will I always be dreaming of an easy export? :p
Wooo, better and more fun this time!
Mind-blowing... is this approach usable for creating short animated films too?
wow this is amazing! great work! will follow to see where this goes
Just watched WWDC and thought of you and these babies! Can't wait to see what you end up with, and I wonder if you and Apple have a similar vision.
Thanks! I'm excited to see what the face tracking improvements are, and I can't help but keep wanting to make a Bebylon app so peeps can build their game's beby character and use it as their Memoji!
I love this guy
Awesome. It would be great if you could publish some of the work for others to follow, like the ZBrush/Apple targets. Thanks for sharing all the great WIP info. Looks great.
Thanks man. You can download the Apple targets from our blog: blog.kiteandlightning.la/iphone-x-facial-capture-apple-blendshapes/
Hyper Amazing!
Aweeeeeesome job man! Can't wait for the next video. Thanks!
Epic work!!
Man, this is pretty cool! Gonna look out for the SIGGRAPH presentation :) BTW, can you give a quick tip on how you generate the iPhone blendshapes for the alien guy or the baby? What tool in Maya do you use? Just skin painting, or wrap? Shrink-wrap? Cheers!
Thanks Stian! Yup, in Maya I used a wrap deformer and delta mush to generate the blendshapes. Before that stage I used shrink-wrap as well, to help fit the neutral pose to the alien face.
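For anyone trying the same transfer, here's a rough Maya Python sketch of that wrap + delta mush idea. All the mesh and node names (arkitNeutral, bebyNeutral, arkitNeutral_blendShape) are placeholders, the doWrapArgList version string differs between Maya releases, and this is not the actual K&L script, just the general shape of the workflow:

import maya.cmds as cmds
import maya.mel as mel

SRC_HEAD = 'arkitNeutral'    # head that already has the Apple/ARKit targets
DST_HEAD = 'bebyNeutral'     # custom character head, already shrink-wrapped to roughly match
ARKIT_TARGETS = ['jawOpen', 'mouthSmileLeft', 'browInnerUp']  # ...and the rest of the 51

# Wrap-deform the character head with the source head
# (select the mesh to be deformed first, the influence mesh last)
cmds.select(DST_HEAD, SRC_HEAD)
mel.eval('doWrapArgList "7" {"1","0","1","2","1","1","0","0"};')

# Smooth out wrap stretching with a delta mush on the character head
cmds.deltaMush(DST_HEAD, smoothingIterations=10, smoothingStep=0.5)

baked = []
for target in ARKIT_TARGETS:
    # Dial the source head into the target pose; the wrap drags the character head along
    cmds.setAttr('%s_blendShape.%s' % (SRC_HEAD, target), 1.0)
    # Bake the deformed character head out as its own target mesh
    baked.append(cmds.duplicate(DST_HEAD, name='beby_' + target)[0])
    cmds.setAttr('%s_blendShape.%s' % (SRC_HEAD, target), 0.0)

# Build the character's own blendShape node from the baked targets
cmds.blendShape(baked + [DST_HEAD], name='bebyFace_blendShape')

The gist: the wrap carries the custom head along as each Apple target is dialed in, the delta mush cleans up the artifacts, and each result gets duplicated out as a target for the character's own blendShape node.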
Sooo much awesomeness!!
You are a genius!! This is awesome! Love it!
please release it on a kickstarter, etc. I will def buy it
Impressive
Great work, it's nice seeing your development and the milestones you're reaching for. Your passion and hard work really show. The baby narration is creeping me out a bit though, haha. What game engine are you feeding the live mocap data into?
Lol, hopefully using real performers and bebyfied voices will reduce the creep factor! We're using UE4 for our game, though I haven't got live streaming going yet (Maya is the middleman). I think UE4 4.19 will make live streaming the data easier to implement, so stay tuned!
This is so awesome
Hi... amazing work ;] Where can I buy that helmet? What is the helmet model? Thanks.
Standard Deviation is the helmet I'm using now… it's super awesome.
This is impressive. And very clean too. As for your helmet, do you have access to a 3D printer?
Dude, your mocap work is incredible, but I'm mostly impressed by how stunning this looks! :O Is this all in-engine, in real time?
Thanks man! This video is rendered in V-Ray, but the last short one I posted was rendered in real time in Unreal, which is shockingly close to V-Ray! Sadly YouTube compresses these things so much that a lot of the sweet details get lost, but it's pretty amazing seeing such high-quality subsurface rendering in Unreal.
Wow dude! The Unreal one actually looks amazing! I can't wait to see more from you, you're doing amazing work, especially with the iPhone for face cap. I popped into the animation studio for Star Citizen in the UK and got to check out what they were doing with Faceware. They had pretty much made tracking markers redundant because they were getting such amazing tracking right out of the box without any markers at all, but it's awesome to see that you're managing pretty much the same thing in real time on a damn phone haha! I also had the pleasure of being scanned into the game and had to go through all those blend poses myself. Probably the most unflattering thing I've ever done :D
Also, I may have missed this, but how are you tracking your hands in real time with so many bones for full articulation? When you were throwing metal signs, I was impressed by how accurate it was.
Great Work! What helmet are you using for this? I would love to try this out too.
It's a cheap paintball helmet, and not very comfortable, but it gets the job done. Here are the helmet details: uploadvr.com/iphone-xsens-performance-capture-bebylon/
Incredible. Is it rendered in some GPU renderer? (Redshift or Octane?)
V-Ray, though not the GPU version.
super!!
Ha! "Davor"!
Epppic
Did you ever manage to build a facial rig to animate on top of the ARKit motion capture data? I'm in the same boat and I'm wondering if this is possible. Thanks
Not a rig per se; I've been doing it in a janky way when I need to do little fixes. I'm getting ready to migrate all my characters to full MetaHuman, which will have new challenges.
@@xanaduBlu Good luck lol. Love the work & the new video. Look forward to seeing more.
Been about the metaverse
How much does the suit cost?
What is that effect on the baby's face at 0:17?
This is amazing!
Can you tell me the price of the Xsens suit?
7k - 12k
Please mention the equipment you use to do this...
2:30 I'm gonna get medieval probably
wow!!!!
I have the full-body Perception Neuron system and it doesn't work reliably. It's such a bad system.
Is this type of thing possible with later iPhone models?
At the moment, the way I'm doing it works just on the iPhone X. Technically though, you can still do facial capture on any mobile with a camera (same idea as Snapchat), using the camera feed instead of the depth map created on the iPhone X.
you should make an app
Bro, forget selling the game, sell the app you're making 😂
Please list all of the things you used, including software and equipment... please (mention prices if possible)... please, anyone reply
Xsens Link suit, iPhone X, Unreal Engine 4.21. Maya was used to make the beby model, and Maya & ZBrush for the blendshapes.
Helmet: uploadvr.com/iphone-xsens-performance-capture-bebylon/
@@trickdiggidy Can I use the iPhone X + Cinema 4D + Xsens for my character mocap?
I don't know enough about C4D and how it handles blendshapes and importing animation data. I'm quite sure getting the Xsens body capture data in should be simple using FBX. @@emekaeffi
@@trickdiggidy Please, I need your advice on an animation project I'm working on. Can I send you an email?
What is the iPhone app called?
This one is not available to the public. Take a look at the MocapX app; it streams the data directly from the iPhone to Maya in real time.
How do you AirDrop the recorded data to the Mac?
The recorded data is stored locally on the iPhone, and I just use AirDrop to copy it to my desktop. I could potentially stream the data straight to the desktop over WiFi, but I haven't had a chance to dive into that.
Cory Strassburger, your solution is much better... I'm using the Unity plugin to extract the data from the phone, which doesn't support wireless... the phone has to be physically connected to the Mac. But what I need is your solution. Is it possible to share it with me? :) BTW, I'm using Perception Neuron instead of Xsens because it tracks every detail of my finger movements.
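For anyone curious what the "record on the phone, AirDrop it over, import into Maya" step could look like, here's a minimal sketch. It assumes the take is a plain CSV with one row per frame (a timestamp plus the ARKit blendshape weights) and a blendShape node whose target names match the ARKit names; the actual K&L file format and node names are not published, so treat every name here as a placeholder:

import csv
import maya.cmds as cmds

TAKE_FILE = '/path/to/take_001.csv'       # the file AirDropped over from the phone
BLENDSHAPE_NODE = 'bebyFace_blendShape'   # character blendShape with ARKit-named targets
FPS = 60.0                                # assumed capture rate

with open(TAKE_FILE) as f:
    reader = csv.reader(f)
    header = next(reader)                 # e.g. ['time', 'jawOpen', 'mouthSmileLeft', ...]
    targets = header[1:]
    for row in reader:
        frame = float(row[0]) * FPS       # timestamp in seconds -> frame number
        for target, weight in zip(targets, row[1:]):
            # Key each blendshape weight onto the character for this frame
            cmds.setKeyframe('%s.%s' % (BLENDSHAPE_NODE, target),
                             time=frame, value=float(weight))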
Why did he stop making videos
Who?
HAHAHAHAHA
Is this what misanthropy feels like?
Lol so is this brown face? Good job.