Next Level MOCAP Tech

  • Published 16 Aug 2024
  • Breaking down 5 real-time Unreal Engine motion capture demos done with an indie Vicon system, MetaHumans, and props. All of the footage shown here is 100% raw MOCAP data streamed from Vicon Shogun into Unreal Engine 4.27 and screen recorded in OBS.*
    *One clip was recorded using an Atomos Sumo hardware recorder instead of OBS.
    Optical MOCAP is more expensive than inertial MOCAP (Rokoko/Xsens) and is normally used in very large, high-end VFX and game studios. This is "indie" in that it is a very small volume, solo-operated by one technician who is also the performer.
    Workstation 1
    - HP Z8, Dual Xeon Gold, 192 GB RAM, Nvidia A6000
    - Unreal Engine 4.27 and Shogun Live 1.7
    Workstation 2
    - Custom PC, AMD 1950x, 32 GB RAM, Nvidia 2080ti
    - OBS
    - Atomos Sumo
    Retargeting to MetaHuman is done live in Shogun Live, not in MotionBuilder.
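
For readers unfamiliar with the term, "retargeting" here means mapping motion captured on the solving skeleton onto the MetaHuman skeleton, whose proportions and rest pose differ. The sketch below is a minimal, hypothetical illustration of the bind-pose-offset idea behind that mapping; it is not Shogun Live's actual solver, and the joint values and conventions are assumptions made for the example.

```python
# Minimal, hypothetical sketch of bind-pose-offset retargeting (NOT Shogun Live's solver).
from scipy.spatial.transform import Rotation as R

def retarget_rotation(source_rot: R, source_bind: R, target_bind: R) -> R:
    """Map a source joint's current rotation onto the target skeleton.

    The offset between the two bind (rest) poses is factored out, so identical
    rest poses on both skeletons would produce an identity delta.
    """
    delta = source_rot * source_bind.inv()   # motion relative to the source rest pose
    return delta * target_bind               # re-applied on top of the target rest pose

if __name__ == "__main__":
    # Toy example: the source arm's rest pose differs from the target's by 10 degrees about Z.
    source_bind = R.from_euler("z", 10, degrees=True)
    target_bind = R.identity()
    source_rot = R.from_euler("z", 55, degrees=True)   # current captured rotation

    result = retarget_rotation(source_rot, source_bind, target_bind)
    print(result.as_euler("xyz", degrees=True))        # -> [0, 0, 45]: only the motion transfers
```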

COMMENTS • 92

  • @CinematicCaptures
    @CinematicCaptures 2 years ago +24

    Every test you do with this stuff blows my mind man.

    • @LammyGaming
      @LammyGaming 2 years ago

      Yo! Taking your course on virtual film making!

    • @CinematicCaptures
      @CinematicCaptures 2 years ago +1

      @@LammyGaming Great to hear man!

  • @WilliamFaucher
    @WilliamFaucher 2 years ago +6

    This is some top notch work, Matt! Well done!

  • @WoofWoofWolffe
    @WoofWoofWolffe 2 years ago +6

    I was so impressed and happy to see your lightsaber demo! Absolutely groundbreaking what you’re doing

  • @robertdouble559
    @robertdouble559 8 months ago

    Dude. DUDE. So freakin good! I'm super late to the party, but happy I found it in the end!

  • @realnickg
    @realnickg 2 years ago +2

    The object tracking really sets this system apart.

  • @Liquidamp
    @Liquidamp 2 years ago +3

    Hey, great mocap demos. When I worked at Microsoft back in 1995 we had some of the very first magnetic mocap systems, attached to Silicon Graphics workstations running Softimage 3D. I had all the 5 ft wires attached to mocap boxes; it was messy, and you could only capture 4 or 8 points per box. It used Velcro straps to attach the wired points to your wrists, arms, and legs, and you had to stay as far away from desks or metal as possible for it to track well. Mocap sure has come a long way. It would be interesting if you could try to mocap crawling on your belly through mud, dirt, and grass with barbed wire above you, like a soldier in training. It would also be interesting to tape cardboard panels onto the sides of the cube, like a large fridge box cut up with doors that open and close, put markers on the cardboard door so it could mimic a car side, and use it for opening and closing a car door and sitting in the car for dialogue. You could add more boxes for four avatars sitting in the car with you. Love your videos, thanks.

  • @HerTouacha
    @HerTouacha 2 years ago +2

    Inspiring as always! I’m on a long path to get there, life has a way with detours but I’m not giving up the dream, still following all the great work you do Matt!

  • @itsjxomusic
    @itsjxomusic 2 years ago +1

    This is so amazing!! I can't wait to try mocap at university for my degree! One thing I cannot wait to get my hands on!

  • @virtual_intel
    @virtual_intel 2 years ago +1

    May the MetaVerse be with you 🪄🖱

  • @user-vp3hy2zc8n
    @user-vp3hy2zc8n 2 years ago +1

    Man, I just want to say that you are doing great stuff. Keep pushing that crazy thing further.

  • @FeedingWolves
    @FeedingWolves 2 years ago +2

    Next level, as always!

  • @TeflonSega
    @TeflonSega 2 years ago

    “If I’m a normal person”... felt that lol

  • @thomasgebauer2745
    @thomasgebauer2745 2 years ago +1

    Awesome 😎 Thanks for sharing

  • @MikeMo1992
    @MikeMo1992 2 years ago +1

    Dude, that's amazing!!!

  • @hiskishow
    @hiskishow 2 years ago +1

    This is incredible

  • @zackcool6080
    @zackcool6080 2 years ago

    So freaking amazing, man. You make difficult things look so easy. God bless you.

  • @daviddelayat-dnapictures
    @daviddelayat-dnapictures 2 years ago +1

    It's amazing!
    Something I'd love to see is some weight added to your weapons, because right now it looks like the MetaHuman is handling toys; I'd love to see it feel their weight!
    Thanks for sharing your tests with us :)

    • @daviddelayat-dnapictures
      @daviddelayat-dnapictures 2 years ago

      Also I'd like to have your thoughts on what would be the best system to handle finger tracking!

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +1

      Yeah I was considering getting a gun that has simulated recoil, but they are a little too realistic to be around my kids. They are for police training etc.
      I think an actually heavy omega hammer would be hilarious.

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +1

      Yeah I have StretchSense gloves I can compare to Vicon 10 Finger optical hand solves. They are very different.

    • @daviddelayat-dnapictures
      @daviddelayat-dnapictures 2 years ago

      @@CinematographyDatabase Yeah, it would be! For the weapons you could just strap on some sandbags if it's only weight you're after. Recoil could be great, but if the facial expression or the rest of the body doesn't react the way it would to a real weapon, it wouldn't work.
      Thanks for the tip regarding finger tracking!

  • @98765theyoutuization
    @98765theyoutuization 2 years ago +1

    This is the way!

  • @ChritsianBucic
    @ChritsianBucic 2 years ago

    Excellent!

  • @Sergiosvm
    @Sergiosvm 2 years ago +2

    You have an iPhone; use the LiDAR scanner to scan the chair.

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago

      Yeah I haven’t tried phone photogrammetry yet, what all do people use?

    • @sweet2k4
      @sweet2k4 2 years ago +1

      @@CinematographyDatabase I don't think the LiDAR on the phone would give you high enough resolution for a good result, but photogrammetry would work well. I used Meshroom for these kinds of things; it's free and pretty easy to use.

  • @cool24a
    @cool24a 2 years ago +1

    Awesome! Plus game changer!

  • @user-ik8vy1rg8f
    @user-ik8vy1rg8f 2 years ago

    Crazy that this channel gets so few views when your other accounts get so many. Great content!

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago

      I let this channel kind of die a long time ago, moved mainly to Instagram, and prioritized short-form content. But for very technical subjects like this I want to do longer-form content again.

  • @gamehatchers
    @gamehatchers 2 years ago +1

    You're a pioneer bro!

  • @TIEVR
    @TIEVR 2 years ago

    To get a better chair model, a free iPhone 12 photogrammetry app would probably give you an excellent reference 3D model to start with, maybe even something worth cleaning up and using as-is.

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago

      Yeah, I need to learn one of the mobile photogrammetry apps; I'm used to using a DSLR and Reality Capture.

  • @wsterlingy
    @wsterlingy 2 years ago

    Another great video! Thanks. Would love to know more about the 10 finger tracking. Of course, would like to see how it goes with two actors.

  • @PHATTrocadopelus
    @PHATTrocadopelus 2 years ago +1

    Spectacular!!

  • @chosenonemedia8337
    @chosenonemedia8337 2 years ago

    Dude, you rock!!! Keep it going!

  • @Mowgi
    @Mowgi 2 years ago

    Living the dream

  • @akisarkiniemi1246
    @akisarkiniemi1246 2 years ago +1

    This is so coooool !! !!

  • @Mowgi
    @Mowgi 2 years ago

    Would be cool to see some photogrammetry models of the real world item

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +1

      yeah I might spend the time to do that with my chair and like a couch model that I'd eventually use in Cine Tracer

  • @renanmgs
    @renanmgs 2 years ago +1

    This is just amazing! Here at my company we will probably get a Vicon system and I will have to do all of this to work with Unreal for live performances. I might need a little help getting the retargeting right. Where did you get the information necessary for this? Amazing video!

  • @tigerplay930
    @tigerplay930 2 years ago

    I'M IN SHOCK!! WOW! Shhhheeeet!!!

  • @patrickmalec8419
    @patrickmalec8419 2 years ago +1

    This technology is cool and groundbreaking, but the CGI being used is still not good enough for replicating humans and all the intricacies of our movement and micro expressions. As far as I’m concerned this is really only useful for pre-production/planning things out for a project. In the next 5-10 years it’ll probably get properly used beyond set extension like in The Mandalorian, but for now I would only use this for a more VFX/CGI heavy project. Otherwise the expense of making a whole studio like this isn’t worth the time and money, even though it isn’t hugely expensive. I’d rather just explain it to people and draw on paper/digitally, or even use stock photos or videos to get my ideas across. That being said, groundbreaking work is being done here and I’m all for it!

  • @jjste2804
    @jjste2804 2 years ago +2

    Looks great… the only thing is that the shoulders look a little low and awkward, or the dude's traps are too big.

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago

      Very perceptive. The shoulders are a trade-off when I want positional accuracy on the hands: sometimes the hands will pull the shoulders down, and it's something I can start to address in the retargeting. The traps and neck are also a bit distorted from the retarget/calibration. The third major problem is the elbows when I face my palms up. Also, my heels aren't always touching the ground correctly. All of these little things take time to sort out between marker placement, calibration, and retargeting, and many of the prop alignments were set up very hastily.
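
For readers cleaning up similar solves, the heel issue mentioned above is often handled with a simple ground-contact clamp on the baked foot keys. The sketch below is a generic, hypothetical illustration of that idea, not the workflow used in the video; the threshold value is an arbitrary assumption.

```python
# Hypothetical post-process for heel/ground contact, not the workflow from the video.
# foot_heights: per-frame heel heights (in cm) from a baked animation; the floor is at 0.
def clamp_ground_contact(foot_heights, contact_threshold=1.5):
    """Snap near-floor heel keys onto the floor to remove light hovering/penetration.

    Frames well above the threshold (e.g. during a step) are left untouched,
    so the gait itself is preserved.
    """
    cleaned = []
    for h in foot_heights:
        if h < contact_threshold:        # close enough to the floor to count as contact
            cleaned.append(0.0)          # plant the heel exactly on the floor plane
        else:
            cleaned.append(h)            # airborne part of the step: keep as captured
    return cleaned

if __name__ == "__main__":
    heel = [0.4, -0.2, 0.1, 3.0, 8.5, 4.2, 0.9, 0.0]
    print(clamp_ground_contact(heel))    # -> [0.0, 0.0, 0.0, 3.0, 8.5, 4.2, 0.0, 0.0]
```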

    • @jjste2804
      @jjste2804 2 years ago

      @@CinematographyDatabase that’s great… amazing progress so far

  • @dannyvalimaki
    @dannyvalimaki 2 years ago +3

    Damn, this is a step up from inertial suits. I'm actively trying to convince myself I don't need this system lol. Once the system has been set up, how much time does it take to get going at the beginning of the day? Do you need to recalibrate throughout the day? Just curious about how much time of the day is dedicated to the actual performance vs how much time is used for technical stuff. Thanks for sharing!

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +3

      I have a video on operating it on the channel. You calibrate it at the beginning of the session, takes like 3 minutes of waving a wand. Then you suit up and calibrate, like 10 minutes. Then you are good to go. Setting up props takes time but once you do it, the system will remember them.

    • @colibristudio3936
      @colibristudio3936 2 years ago

      @@CinematographyDatabase I think the biggest advantage is that, in comparison to other mocap systems, you can add multiple objects without much effort (apart from modelling them later in Maya), AND the high-framerate capture is just amazing.

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +1

      @@colibristudio3936 Yeah, I haven't shown recording the MOCAP, processing, cleaning, editing, etc. yet, but it records natively at 120 FPS. So I end up baking SUPER clean 60 fps animation assets for Unreal Engine. It's a lot of data though.
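
As background on what baking 120 FPS capture down to 60 fps assets can involve, here is a minimal, hypothetical sketch of one common approach (decimation with an optional smoothing pass). It is an illustration only, not the actual Shogun/Unreal export pipeline.

```python
# Hypothetical 120 FPS -> 60 fps bake, NOT the actual Shogun/Unreal export pipeline.
import numpy as np

def resample_half_rate(samples_120: np.ndarray, smooth: bool = True) -> np.ndarray:
    """Downsample per-frame channel data (frames x channels) from 120 FPS to 60 fps.

    With smooth=True, each kept frame is averaged with its dropped neighbour,
    acting as a tiny low-pass filter on marker jitter; with smooth=False it is
    plain decimation (keep every second frame).
    """
    even = samples_120[0::2]                 # frames 0, 2, 4, ... -> the 60 fps timeline
    if not smooth:
        return even
    odd = samples_120[1::2]                  # the frames being dropped
    n = min(len(even), len(odd))             # guard against an odd frame count
    averaged = 0.5 * (even[:n] + odd[:n])
    return np.vstack([averaged, even[n:]])   # keep a trailing unpaired frame, if any

if __name__ == "__main__":
    # Fake data: 2 seconds of a single translation channel captured at 120 FPS.
    t = np.linspace(0.0, 2.0, 240)
    channel = np.sin(2 * np.pi * t).reshape(-1, 1)
    print(resample_half_rate(channel).shape)   # -> (120, 1), i.e. 60 fps worth of frames
```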

  • @34zporlier10
    @34zporlier10 2 years ago +1

    Without trying to add too much to your workload: would you be able to document or talk a little more about the issues/complications you ran into? I know it's a bit odd, but I feel like it's almost more beneficial to know what to avoid lol

  • @davandstudios
    @davandstudios 2 years ago

    Sweet

  • @alanjosephproductions
    @alanjosephproductions 2 years ago

    Love it! 👍

  • @reddcube
    @reddcube 2 years ago

    I'm interested in the proper placement of tracking dots. If you wanted to include a table in your scene, where would you place the dots?

  • @mylesdb
    @mylesdb 2 years ago

    Hello Matt, fellow Epic VP Fellowship alumnus here! The Deaf community has been trying to set up some VTuber MetaHumans with full hand tracking (for sign language). Can you do a demo showing the ASL fingerspelling alphabet? This is a needed "holy grail" for any mocap hand tracking. We are currently struggling most with retargeting the hand's finger bones as well as the inverse kinematics on the shoulders... any insights into this area would be so helpful.

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago

      I believe StretchSense is being used on a Vicon project with MetaHumans who do sign language. When I get back into MOCAP I'll do some more high-precision hand captures.

    • @mylesdb
      @mylesdb 2 years ago

      @@CinematographyDatabase That'd be fantastic. You mentioned it in another video, which I found after this one, but it looks like you didn't get around to the hand tracking demo yet. StretchSense seem to be the leaders in the hand tracking mocap space, but the Vicon tracker balls are pretty capable if you have one for each finger bone, right? A little beyond our budget at this time, however, so I'm playing around with the Quest 2 for the hand-tracking part of the process in Unreal Engine.

  • @OTHONASG12
    @OTHONASG12 2 years ago

    Hello, I'm having some difficulties with retargeting in Shogun Live. Did you find a tutorial for it, or any other help? I've been stuck for a few weeks now and have no idea what is going on.

  • @josacra
    @josacra 2 years ago

    WOW...

  • @JasonQueue
    @JasonQueue 2 years ago +1

    Are you able to hook up any sort of controller to the system too? Say, to make a real-time blaster?

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +1

      yeah any wireless USB gun or controller would work. Attaching a Vive/Index controller could work too.

  • @user-ss9kl1vu3u
    @user-ss9kl1vu3u 4 months ago

    Good afternoon. Could you tell me where I can buy a motion capture system?

  • @armondtanz
    @armondtanz 2 years ago

    This is on another level... How much was your setup? Just curious to know what the indie budget is: software and ALL hardware.

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +1

      ua-cam.com/video/YgT6XY6ldj8/v-deo.html this video talks about the cost of the MOCAP system. This doesn't include the multiple computers and other support hardware I have for this setup

  • @virtualcircle285
    @virtualcircle285 2 years ago

    Can you get some trained martial artists to spar or do a fight demo with it on? I guess you need two suits for that and no face tracking

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago

      Two people at once is possible at some point. Trained martial artists isn’t likely any time soon though.

  • @HellSpawnRulerOfHell
    @HellSpawnRulerOfHell 2 years ago +3

    10:32 SteamVR can do that for like a fraction of the cost. lol
    And since newer VR games/tests use dynamic hand poses for grabbing objects, gripping the gun would actually look better. (Though that's a software feature.)

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +2

      VR MOCAP is great and I’ve programmed multiple full-body solvers with Vive Trackers and Index Controllers. One of their strengths is their positional accuracy.
      However, as you mentioned, the interaction is canned/baked and/or a virtual collision, which gives a perfect final result but will never vary; if you want to interact with something the developer hasn’t scripted/posed, you can’t.
      With this optical solve you can interact with any real-world physical object and get a dynamic new hand pose every time.

    • @HellSpawnRulerOfHell
      @HellSpawnRulerOfHell 2 years ago +1

      @@CinematographyDatabase With accurate finger tracking like StretchSense you can get the same level of accuracy with the fingers.
      Also, with virtual collision you can interact with digital objects and get a dynamic new hand pose every time.

  • @chadgtr34
    @chadgtr34 2 years ago

    If you are sweating from running on the treadmill, sweat will get onto the sensors. Is that a problem?

  • @Fxiques
    @Fxiques 2 years ago

    I would like to know exactly how much this whole system costs to be able to do the same thing you are doing:
    one pass capturing the body, the surrounding elements, and the face.
    How much does the whole setup cost?

  • @OperationBaboon
    @OperationBaboon 2 years ago

    Wouldn't it be more practical to place markers on the end tips of items, so the base form is a straight line? And can you assign certain marker groups to a specific virtual prop, so that if you "throw" in an item from off screen, it knows which prop it is supposed to project onto it?
    I have been conceptualizing various virtual production methods for years, and this is the one that comes closest to what I came up with.

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +1

      You can do pretty much anything for prop markers in my experience so far. One of the markers becomes the origin/root and the others define the other axes, but beyond that Shogun can recognize different props really well.
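
To make the origin/axes idea concrete, here is a minimal, hypothetical numpy sketch of how a rigid prop's pose could be derived from three labelled markers: one chosen as the root/origin and two more fixing the axes. The marker roles and axis conventions are assumptions for illustration, not how Shogun defines props internally.

```python
# Hypothetical prop-pose reconstruction from three labelled markers.
# Not Shogun's internal prop solver; axis conventions are arbitrary choices here.
import numpy as np

def prop_pose(root, x_marker, aux_marker):
    """Build a 3x3 rotation matrix and origin for a rigid prop.

    root       -> becomes the prop's origin
    x_marker   -> defines the +X axis direction
    aux_marker -> lies roughly in the XY plane; used to fix the remaining axes
    """
    x_axis = x_marker - root
    x_axis /= np.linalg.norm(x_axis)

    # Remove the X component from the auxiliary direction (Gram-Schmidt) to get Y.
    aux = aux_marker - root
    y_axis = aux - np.dot(aux, x_axis) * x_axis
    y_axis /= np.linalg.norm(y_axis)

    z_axis = np.cross(x_axis, y_axis)        # completes a right-handed frame
    rotation = np.column_stack([x_axis, y_axis, z_axis])
    return rotation, root

if __name__ == "__main__":
    # Toy prop: three markers in millimetres, lying roughly along world X, tilted slightly.
    root = np.array([100.0, 50.0, 20.0])
    tip = np.array([400.0, 50.0, 35.0])
    side = np.array([100.0, 90.0, 20.0])

    R, origin = prop_pose(root, tip, side)
    print(np.round(R, 3))      # orientation of the prop in the capture volume
    print(origin)              # [100.  50.  20.]
```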

  • @gamehatchers
    @gamehatchers 2 years ago +1

    @CinematographyDatabase
    What Vicon model system exactly are you using?

  • @lovedancexrsingleman8711
    @lovedancexrsingleman8711 2 years ago

    What's your Vicon camera resolution, and how many cameras are you using? It's cool.

  • @ady7d77
    @ady7d77 2 years ago

    Make a course please. I need it.

  • @shortvideosi2626
    @shortvideosi2626 1 year ago

    Where did you buy this mocap system?

  • @josacra
    @josacra 2 years ago

    Can you share all the software and hardware you used to capture all this?

  • @wobbledaggerfilms
    @wobbledaggerfilms 2 years ago

    Where do you see the price of optical systems like Vicon moving in the next two to three years?

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago +2

      Vicon is very high end. It's literally medical/science-grade motion tracking, so I don't see that price changing too much.
      BUT if there is a non-trivial amount of interest from smaller/indie companies and users, perhaps some sort of hardware/software package could be put together. I don't speak for them of course, just speculating.
      "Indies" needing a Vicon didn't exist, IMO, until recently, with Unreal Engine (real-time rendering being free) and MetaHumans and other high-quality digital characters becoming available and affordable. So it's a bit of a new emerging market being accelerated by VTubers and general "Metaverse" popularity.

    • @wobbledaggerfilms
      @wobbledaggerfilms 2 years ago

      @@CinematographyDatabase I appreciate the reply. My use case is narrative filmmaking, which requires a high degree of fidelity in fingers, facial, and bodily expression. As far as I can tell, IMUs aren't up to the challenge yet. I have an award-winning script and an actor ready to go, but like other small-scale operators, none of the budget required to be an early adopter.

  • @shekiba.646
    @shekiba.646 2 years ago

    MOCAP with track 6x or up vs Rokoko? Which is the best?

    • @CinematographyDatabase
      @CinematographyDatabase 2 years ago

      Best quality MOCAP will almost always be optical MOCAP like Vicon. But "best" for your budget/space for many will be a simple inertial suit like Rokoko or Xsens.

    • @shekiba.646
      @shekiba.646 2 years ago

      @@CinematographyDatabase I understand. I can't buy optical MOCAP because it's a lot of work and needs a large room with tracking cameras, etc., and it's expensive too... but Rokoko is easy: just the body suit and gloves for sign language at $2,745, a cheap price.

    • @DjDialtone
      @DjDialtone 2 years ago

      @@shekiba.646 Hey, I've got a Rokoko suit for sale if you want it.

  • @jaickerag
    @jaickerag 2 years ago

    Facebook will give you millions

  • @SuperlativeCG
    @SuperlativeCG 2 years ago

    The results speak for themselves.

  • @Kitsune_Art
    @Kitsune_Art 11 months ago

    Hi, excuse me, how do you capture the body and face at the same time? D: