Wow, it's oddly satisfying watching the side by side and how perfect the chair interaction is
Yes...this was visual ASMR ....
Love people like you who just share their knowledge with the world. If you do good, good will come back to you. All the best to you!
This is some of the best use of mocap; it avoids the uncanny prop interaction that breaks the illusion so badly.
0:30, his hand slightly clips into his knee, and when he folds one leg it passes through the other. His lower leg also slightly clips into the floor when he steps down. But overall, it's great mocap!!! Very smooth motion, and no stiffness or errors in hand, foot, or head movement.
It's absolutely mind-blowing how far a single person can go creating mocap like this. Technology is power!
Triple-A game-quality animations! Bravo!
Nice work. I love that you can see the chair sink into the carpet slightly when you sit on it, amazing fidelity. Jokkmokk strikes again!
Your chair flicking and spinning game is on point Matt very impressed. I'd hire you as a mocap actor in a heartbeat
for better or worse chair MOCAP is part of my CV now
@@CinematographyDatabase it's gonna be the climax in your showreel :0
Looks really good Matt! Good work! Thank you for sharing the process.
Just want to acknowledge and thank you for making all these knowledge free for us! Doing great work!
Only issue I can see is the feet but this is just top notch, looks amazing and makes me excited for the future.
Wow Vicon is legit! Keep up the great content!
Great stuff! That's some really well synced mocap
nothing like an intro chair hug
This is so impressive! I love it!
Thanks for this great information 😊
I just want to say that you are a legend. To make all of this production work with Unreal, then document it and make videos for us, is insane! Enormous effort has been put into that, and we thank you!
Thank you with all my heart for following and supporting me 💝
This is something incredible O_o, simply stunning! ^___^
Amazing work, thanks so much!
This is amazing, I'd love to see more mocap stuff like this.
This is SO cool. Thank you for sharing!
Absolutely amazing! Very clean capture! I commend you!
oh i missed this! super cool
Great overview of your workflow, keep up the great work!
I would love to see a video specifically about retargeting a MetaHuman in Shogun. It would be great to stream directly from Shogun to Unreal without Motion Builder, even if it's a more complicated setup.
Yeah, my earlier videos were all recorded live from Shogun directly into UE. For a biped/MetaHuman it works perfectly.
This is awesome work man
Awesome Matt! Thanks for sharing
It's really nice to see the progress you've made and what is possible with such a mocap setup! Just dreaming for now, but once I can afford it, I also want to capture self-made animations, but for games.
this is really nice! well done
Amazing stuff man, keep it up!
You are AMAZING. Thank you
You are a genius, making explanations really accurate and simple. Your hard work is way too complicated for me to accomplish, but your friendly way of explaining makes us a part of your world. You are amazing :)
10 years from now VR is going to be very popular
This video was super cool
It's super amazing...👍
Thank you.. 🙏
Better interaction than most games.
It's super amazing and very helpful!!
Thank you!
Holy Moly, very impressive!
Fantastic work. I've been enjoying watching you progress over the last year.
Curious: there's a bit of a shoulder slump compared to you (you have straight/prominent shoulders). Is this a MetaHuman thing or from Vicon?
It’s likely from the retarget because I have hand position constraints. So the solve may pull the shoulders down to match. It also may be an initial calibration “issue,” I didn’t recalibrate the body on this test session. Hard to say lol, it’s complicated.
this is mind blowing !
This is great! Even the raw data is so much nicer than most other things I've seen. I wish Vicon (or another optical mocap system) had a solution that was more affordable.
I'm pretty sure that XSens also charges annual subscriptions to use their software and cloud-based clean-up... on top of the price of the suit.
Finger tracking from Manus or StretchSense alone is $6-10K.
I believe OptiTrack makes a ~$15K system, but I can’t speak for the quality of the solve or tracking. I know people use them for camera tracking.
You should really try an Xsens and see the data, IMO, before you commit. Any inertial mocap system will require significant cleanup, and their software doesn’t handle the retarget. So you would also need MotionBuilder or Maya in your workflow.
@@CinematographyDatabase Thanks for the response! I currently have a Rokoko SS and Gloves, and although I'm happy with the facial capture and finger tracking, I seem to have lots of issues getting good takes with the suit itself (mostly around faster movements like dance moves, punches, etc.). It's just a lot for a one-man show!
The chair is paid actor
Fantastic.
That's so smooth and clean. Does that quality come from the MetaHuman rig in Shogun, or is it extra cleanup in MoBu/Shogun Post? Especially the fingers: I can see there is still some "extra thickness" on the grabbed object, but you have achieved a much more precise result in finger positioning than I can :P (that's just what I call it, I don't know the proper name for it). All in all, a stunning result!
Shown here I’ve done an “auto” optical clean up pass in Shogun Post. The overall solve and retarget however is basically live, I didn’t do any MOCAP editing to change positions or angles of joints.
The fingers are Vicon 10 finger marker set and then live retargeted to the MetaHuman hand which is a specific setup to get this positional accuracy.
Incredible.. 🤯
Incredible
this is... freaking inspiring
That is great
the future of cinema
We are the MetaHumans of other humans.
So cool. Would be fun to mess with.
That is amazing! wow I am so... wow. This is awesome.
Thanks for sharing your workflow
you forgot the music at the end LOL but this was a really cool video!
I really, really appreciate your work, and it's great watching you and your explanations. Thanks!
super dope
Wow, amazing!
This is what i want to learn
sheesh that is good!
Insane
Next level
Really great video that shows the possibilities of using marker-based mocap with Unreal Engine. It might be that you are using a competing Vicon system (we are on OptiTrack, but the workflow is the same), but hey... great video, keep up the good work!!
Are you using MotionBuilder for your retargeting? Getting the retarget to line up with the prop is the real challenge IMO, which Shogun Post makes pretty straightforward.
@@CinematographyDatabase We stream from OptiTrack Motive directly into Unreal; it works like a charm and no additional retargeting is needed. Let's see if we can bring up some demos similar to yours soon! ;-)
@@johanbeskow9025 Does Motive have a retargeting system? I would be interested to see a similar chair interaction with a MetaHuman from the latest OptiTrack Motive system.
Freaking Awesome!!!👌👌👏👏🙏
The black shirt guy is amazing in copying the white shirt guy.
Good job
7:15 So where's the music bro (JK)
Great stuff ! I like how clean it is, like , JESUS THAT'S CLEAN AF
the song I used was getting copyright claimed and causing issues on YouTube and TikTok, so I had to cut it sadly 😑
Nice; the last piece will be the facial expressions.
I was born too early, I wish I was born 10 years later so I could make this my main study now.
Cool🔥
Hi!! Great video, it is very satisfying to watch. I'm a newbie and a bit lost... haha. What software and hardware are you using for tracking your body? Thanks in advance!
I'm using Vicon hardware and software. You can see me setting up the system in this video - ua-cam.com/video/oCXVdTjN5Lw/v-deo.html
Incredible! Very inspiring!
Insane 🔥... Do you plan to create a cinematic or something? :)
these animations will go into my app/game Cine Tracer, it’s a storyboarding app for filmmakers
Awesome results! That's my daily job, to be honest, but I'm mainly retargeting in MotionBuilder. Your hands are really good as well; do you drive them from the solving skeleton, or is Unreal managing it? Like I said, MotionBuilder is my realm, so I'm not used to working in any game engine; it's not part of my job.
Cheers from Montréal, Canada
Fun. I did the same exact process for setting up virtual cameras and weapon props on CoD :D
Cool
I understood almost nothing of what was said, but it looks like an absolute masterpiece! Super! Bravo, great work!
Unfortunately, we don't have Ikea in my country. So all of this is impossible.
no meatballs 🥲
WOW
Is it possible an interaction of the Avatar with the soft tissue of the chair? Can you see the weight pressure of the avatar on a soft sofa for example?
Hey, you forgot to play the music! Great work though!!!
300K subscribers and 671 views. Alphabet is so stupid. Thanks for the quality content!
Nah, this channel has had many rebrands and directions. Mocap like this is niche.
Very cool!
Best realistic mocap ever... but Vicon is too expensive. Can we achieve similar realism with a suit?
This level of precision prop interaction is only possible with Vicon. Combination of Shogun Post retargeting and general stability of the optical tracking.
Mocap wins. Flawless victory!
Looks like face/eye-tracking is the final piece.
7:15 Hey! You promised music! ;)
NO ONE else is saying it, but I have mocap suit envy, I really really do.
"Welcome to matrix!" 😧
Does this mean we won't need an animation department anymore very soon?
Hey, I'm a big fan of your work, and I've spent a couple of weeks learning Vicon. Would you mind helping me with something? You are probably the most qualified person for it, and I can't find an exact solution to this specific problem anywhere. I have access to an older Vicon Viper setup for virtual production, and I'm trying to make it work for mocap with a custom-made suit with powered active markers. It's already working, but now I have a problem streaming it onto a MetaHuman skeleton in Unreal. I tried to retarget it, and it should work, but it doesn't, and I'm not sure why. How exactly did you retarget the Vicon skeleton to MetaHumans in Unreal for live capture? I've been stuck on this for a few days now: I used the basic retargeting steps in UE, but only the neck reacts and moves while the rest of the MetaHuman doesn't, and I don't know what I did wrong. I've actually tried most of the mocap systems (Xsens, Rokoko, Neuron) and was able to make them work with MetaHumans pretty quickly, because all of them had their own official retarget UASSET and tutorials for download, but I can't find anything similar for Vicon, and there are very few tutorials on how to work with it because of its heavy price.
Do you have any ideas what could be wrong, or some tips? Or maybe could you share your files or make a more detailed tutorial?
You need to create a retarget from your Vicon skeleton to the desired MetaHuman skeleton in Shogun Post, and then use that in Shogun Live before you Live Link into Unreal.
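For intuition only: the real retarget is authored in Shogun Post, but conceptually it boils down to a per-joint mapping from the Vicon solve skeleton's bone names (and rotation spaces) onto the MetaHuman skeleton's bones. The names below are hypothetical illustrations, not the actual asset. An incomplete map is one reason only part of a character (like just the neck) ends up moving:

```python
# Toy sketch of a name-based retarget map. Joint names on both sides
# are illustrative assumptions; the real mapping lives in Shogun Post.
VICON_TO_METAHUMAN = {
    "Hips": "pelvis",
    "Spine": "spine_01",
    "LeftUpLeg": "thigh_l",
    "LeftLeg": "calf_l",
    "LeftFoot": "foot_l",
    "Head": "head",
}

def remap_pose(vicon_pose: dict) -> dict:
    """Rename each joint's transform to its MetaHuman target bone.

    Joints without a mapping are silently dropped -- which is exactly
    how an incomplete retarget leaves most of a MetaHuman static.
    """
    return {
        VICON_TO_METAHUMAN[joint]: transform
        for joint, transform in vicon_pose.items()
        if joint in VICON_TO_METAHUMAN
    }

pose = {"Hips": (0.0, 90.0, 0.0), "Head": (0.0, 0.0, 12.5)}
print(remap_pose(pose))  # {'pelvis': (0.0, 90.0, 0.0), 'head': (0.0, 0.0, 12.5)}
```

In practice you'd also need to account for differing rest poses and rotation orders, which is what the Shogun Post retarget setup handles for you.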
😍
How did you get the basic MetaHuman mesh inside Shogun? I can’t get it to work 🤔 Which file did you use?
Weird question I've always wanted to ask... is the iPhone still the best purchase you can make to record facial capture? In a... n00b, do-it-yourself kind of way, I mean.
What’s the most budget way to get into mocap? In terms of hardware and software?
What kind of skeleton do you use for the MetaHuman? My skin is a mess when I enter Shogun Post.
This setup is $50k. Do you think it will ever come down in price so us normal creatives can buy it? :(
Yeah I’m not sure there’s many “indie” people doing anything with Vicon. I don’t know anybody with $50k to spare. I love Matt’s videos but any Vicon stuff isn’t majorly useful unless you’re rich :) HOWEVER it’s nice to know what I could buy if I win the lottery! :)
@@kthmtchll I guess so. But with a budget you can just rent it, plus studio space for the capture, plus the team you need to actually make it work.
It is very expensive, and it's not only the $50,000 that counts: you need adequate space to use it, spacious and with good lighting, plus state-of-the-art equipment to move it all. If you add things like facial capture, hands, space, and lighting gear, the price easily climbs past $80,000. A madness that only companies, entrepreneurs, or big AAA studios can afford, not an indie.
Interesting! 👍
You got the Motion Capture suit with the Unreal Engine 4 logo on it?
So you went from your original look to an avatar that eerily resembles Jussie Smollett? 🤣