Is iPhone Motion Capture Ready for Production? Move.AI or Xsens

  • Published 5 Oct 2024

COMMENTS • 84

  • @Unrealhan
    @Unrealhan  A year ago +4

    Which mocap are you using?

    • @MR3DDev
      @MR3DDev A year ago +1

      Perception Neuron, but I am going to give Deepmotion a shot.

    • @Unrealhan
      @Unrealhan  A year ago

      @@MR3DDev Curious to know how they compare to Xsens or Rokoko

    • @MR3DDev
      @MR3DDev A year ago

      @@Unrealhan It's an easier setup than Xsens (also no subscription fee), but the hand capture is not good. Rokoko was pretty janky, but I tested it in 2018 so it may be different now.

    • @Megasteakman
      @Megasteakman A year ago +1

      Awesome look at this! I was tempted to look into move, but I think I will stick with my Unreal Engine Vive Tracker solution for the time being.

    • @m1sterv1sual
      @m1sterv1sual A year ago +2

      Rokoko, but soon moving to Xsens

  • @landonp629
    @landonp629 A year ago +16

    Another issue I have is your testing method. You are wearing baggy clothing without much contrast, which will also reduce the accuracy of the tracker. I use Move AI in a well-lit studio space with white walls, with the talent each wearing body-snug, brightly colored suits. Haven't had any real issues. I think too many people assume this is TOO casual a method for capturing mocap data - you need to treat the studio space like you would a $50,000 IR tracking system, not like you would an inertial-suit workflow.

    • @FinalGrade
      @FinalGrade A year ago +3

      Spot on. Exactly. It's a camera-based design. Also, Xsens requires a subscription too from what I can see, and if you want unlimited processing minutes it's $250 a month. Pretty crazy. Move AI also lets you use iPhone 8s instead as cheaper cameras, or just use experimental mode and use different cameras entirely. Some guy in the Move AI Discord had these $25 cameras that fit Move AI's specifications and had four set up, getting great results.

    • @bovineox1111
      @bovineox1111 A year ago +1

      @@FinalGrade I'm going to give this a go with some cheapish 1080p 60fps action cams

    • @riretf
      @riretf 3 hours ago

      @@FinalGrade If you use a cheap iPhone, the quality will be worse. You need 8 iPhone 13 Pros to get similar results to the Xsens.

  • @landonp629
    @landonp629 A year ago +12

    $4,000 for 6 iPhones? Why? You can get used iPhone 11s for like $200 a pop. Maybe $1,200 on the high end.

    • @IamSH1VA
      @IamSH1VA A year ago

      ~$50 webcams also work well; you don't need to spend $200 per iPhone.

  • @robertturaa9561
    @robertturaa9561 6 months ago +2

    It's funny that people complained about the pricing of this mocap (from what I understand you are talking about the multi-cam option).
    NOW it's not $365 a year but $5,000 a year,
    and not 30 minutes a month but 20 minutes a month.
    And the one-camera option is also still more expensive than that full multi-camera plan was.
    I just paid $20 for one month of access with 3 minutes of processing time.
    I was so hyped back then to try it out, but I didn't have access to an iPhone... I finally got to borrow one or two, but now I'm screwed again, this time by the pricing policy :(

  • @mocappys
    @mocappys A year ago +12

    Great video, Han. Really interesting to see a comparison of the two systems.
    One of the keys to getting the most out of any mocap system is knowing exactly what it can and can't do so you can plan around it, so this is really useful.
    I always find the claim "you don't need a suit" a little cheeky. While you don't need to wear a Lycra suit, you do need clothing that contrasts with the surroundings and with the different parts of your body to improve tracking.

    • @Unrealhan
      @Unrealhan  A year ago +2

      Agreed. The tricky part is that it's AI, so we're all just guessing what will work best, including clothing.

    • @mocappys
      @mocappys A year ago +1

      @@Unrealhan yeah, it’s a shame the software can’t do a quick preview to give you some idea if it’s worked or not, a bit like raw marker/sensor data in other software. It’s the not knowing if it’s worked that’s a bit of a deal breaker for some.

  • @Xashadowin
    @Xashadowin A year ago +3

    Really nice comparison. Let me just add the fact that mostly... everyone has a phone capable of doing this. We tried it in our studio, and if you combine someone's phone + a tablet, you reach the number of devices needed for the capture pretty fast. In our test I just borrowed someone's iPhone + my own + my other friend's phone, and finally my iPad. We ended up with four devices at a total cost of... $0. Which cuts the cost down to "only" the tracking fees.
    But as for the accuracy and everything in this video, thanks for comparing the exact same motion with two different techniques!
    It's still super impressive for a tech that can be used by anyone!

  • @FinalGrade
    @FinalGrade A year ago +3

    I put Move AI with a period and YouTube auto-deleted it. Fantastic. Well, let's try again. First off, thank you for this video and information! I have to disagree, however, on the following points:
    1. You can spend way less on an iPhone 8, or even different cameras entirely. In experimental mode you can use any camera that fits the right specs. A guy got it working with $25 cameras. Move AI even started out with GoPros before iPhones.
    2. The 30-minute cap seems bad, but then you realize that Xsens does the same thing unless you pay $250 a month for unlimited processing. A MONTH! And that's after buying the suit.
    3. Want to add another person with Xsens? Well, now you need another expensive suit. Or you have to record again separately and hope the animation syncs.
    4. Suiting up every single time with multiple actors can be a headache.
    I still think the price bracket Xsens wants is not comparable to Move AI. It is still much cheaper to use Move AI. Unless there is a free way to process unlimited data with Xsens, I don't see how it is better when it has that same subscription issue with an even higher barrier of entry.

    • @riretf
      @riretf 2 hours ago

      1: The cheaper the camera you use, the worse the results will be. You can use a GoPro or an iPhone 8+, but the quality will be worse than a Rokoko or PN3.
      3-4: Hardware: you can get similar results using the cheaper Awinda and Xsens Animate. Especially for live streaming, 60 Hz is the baseline. In the case of Move AI, the results get much worse as more people are added: the actors' bodies block each other from the cameras and aren't captured, and it doesn't provide a huge capture volume like Vicon or OptiTrack, so in many cases I do not recommend the AI motion capture approach at all.

  • @Kidultchar
    @Kidultchar A year ago +5

    It's true that it's too expensive, but for amateur creators like me, there is no alternative other than Move AI. I wish there were a monthly subscription plan.

  • @renbry
    @renbry A year ago +6

    Thanks for the comparison and a very fair review. It was certainly eye-opening when I got into mocap via the Xsens Awinda a few years ago. Although there's a single term for this ("mocap"), there are actually so many subtle differences that one ought to understand - some you spoke of, such as the capture volume, where inertial suits can run loose in the wild versus inside a volume. Some others I had to learn the hard way: Do you want live-streamed mocap or just offline recordings? Do you want really flawless mocap done live or offline? Two or more people touching, with great registration between them? Props? Camera tracking? Visibility of the mocap suit to a camera that may be filming?
    So many of these questions I didn't know I needed to ask but had to discover along the way. We've since gone with OptiTrack for great live/offline recording, multiple-person registration, props, and camera tracking, although recently we were asked to record mocap of people wearing high-fashion clothing - so the Xsens came out of the cupboard because we could hide the sensors!
    It's a wonderful thing, but gee, was it expensive learning all these points!
    Thank you for helping the community understand more about the topic!

  • @remcosikkema
    @remcosikkema A year ago +2

    Thank you for the fair comparison; it really shows the strong and weak points of both systems. This information will for sure help those checking out different options, and I would surely recommend they watch this video to get a good overview.

  • @Chainism
    @Chainism 10 months ago

    This was so well filmed, edited, narrated, and researched. Subscribed. Keep up the amazing work, you'll definitely take off buddy!

  • @MR3DDev
    @MR3DDev A year ago +15

    30 minutes a month? Geez I got hours of mocap data for my 10 minute short film. These monetization methods are not good.

    • @FolkerHQ
      @FolkerHQ A year ago +1

      It could be cheaper to rig and animate it yourself 🙂 or ... have a look into iPi Recorder (ipisoft)

  • @lemmonsinmyeyes
      @lemmonsinmyeyes A year ago +1

    This is awesome, thank you for the direct comparison!

  • @ArthurBaum
    @ArthurBaum A year ago +1

    Very glad that somebody finally has a more critical view of that software solution. After two months of using it, I gotta say they are definitely headed in the right direction, but the execution is far from perfect. The most frustrating things for me are the restrictions of the app: 1. You can't upload individual files, only all or none. 2. They have deactivated the function to share the raw recorded videos from the iPhones to other devices. 3. This wouldn't be that much of a problem, but unfortunately the app is prone to crashing when uploading a lot of takes (which is normal on long production days). The older the iPhone, the worse the problem gets. So that's why their USP of cost-efficient productions is kind of invalid. Ultimately you end up buying or renting expensive iPhone models.

    • @Unrealhan
      @Unrealhan  A year ago +1

      Very good point. The user experience definitely leaves more to be desired. The app is literally just doing the recording; it should let us access the raw files.

    • @ArthurBaum
      @ArthurBaum A year ago +2

      @@Unrealhan You CAN ask support for individual access to this function. Though that doesn't solve the problem of the app constantly crashing on uploads, especially on older models. It has something to do with bad memory management. The devs know about this problem. Though they are still trying to push the narrative that you only need iPhone 8s in order to do unlimited mocap.

    • @anthonyganjou2485
      @anthonyganjou2485 A year ago +1

      @@ArthurBaum We're working really hard on fixing all these problems, as you know. We are only three weeks in from launch, so these gremlins should all be fixed in the next couple of weeks. Also appreciate your feedback on the functionality for reviewing this, too.

    • @FinalGrade
      @FinalGrade A year ago +2

      @@ArthurBaum You can actually go cheaper than an iPhone 8. A guy in the Discord set up $25 cameras that fit within spec and ran four of them with great results. I don't know much about the connectivity issues you are having, but it seems like a pretty fantastic alternative to a $12,000 mocap suit and a $250-a-month sub to Motion Cloud.

  • @knowyourfortune
    @knowyourfortune A year ago

    It’s nice to see your channel going so well!

  • @gabrielkuklinski9806
    @gabrielkuklinski9806 A year ago

    Han, I'm just addicted to your videos, mate. I NEED A FULL COURSE OF YOURS SOON! Any plans on doing it? Maybe a filmmaking master course or something? Cheers, and thank you as always for the amazing content!

  • @samgoldwater
    @samgoldwater A year ago

    Fantastic elucidation of where these tools are at, thank you Han! 😎🥳🏆

  • @anthonyganjou2485
    @anthonyganjou2485 A year ago +4

    Han, thanks for the objective review.
    We'd welcome your thoughts on what a more viable free trial period might be, alongside the monthly minute allocation. Also, if you want to do more tests, please do reach out and we can facilitate more credits for you to continue to explore. Totally hear you on the connectivity issues; two weeks out from launch, we are still ironing out some gremlins from the system and making it much more robust.
    Just to confirm, you are testing this against the MVN Link, which costs $12,000, with software that costs $5,000-10,000 per year? If so, it's a thrill to see the data come out so closely comparable. Sorry you had to shoot in the rain, but loads of people are shooting in much tighter spaces indoors using the wide-angle lens on the iPhone 11 and up. The hardware for a system like this can run as little as $1,300 if you are buying refurbished iPhone 11s.

    • @Unrealhan
      @Unrealhan  A year ago +3

      Yes, the benchmark is MVN Link, not a cheap suit for sure, and I understand not everyone has access to a system like that. Move AI does present an alternative for sure.
      Good to know there's room for extra minutes for testing, etc. I think the trial period should be 10 minutes (I think it's 5 for beta users?). That should give people enough time to troubleshoot the equipment and run some actual tests in different settings and environments. They have probably already purchased the iPhones, so most likely they will commit.
      In terms of monthly allocation, personally I would prefer a monthly subscription instead of annual. The reason being, I only use mocap for the first half of a project (shooting and previs), not every month. So depending on each use case the quota may vary, but I think a flat 30 mins/month is not very flexible. I'm curious to know what the community thinks too, as it varies case by case.
      I'd love to hear what your roadmap is in terms of data quality too. AI develops so fast, you guys are probably already working on some cool stuff 😉

    • @PeteD
      @PeteD A year ago

      If you can decouple the processing from the export, that will open up some options: allow users to test footage recording conditions/variables/workflow, but limit how much they can actually export.

    • @anthonyganjou2485
      @anthonyganjou2485 A year ago

      @@PeteD appreciate your feedback thank you.

  • @cecka
    @cecka A year ago

    Great summary and initial review - looking forward to hearing more about this in the future.

  • @naq_montages
    @naq_montages A year ago +2

    0:32 It would be so cool if you could give us some tips on filming two characters fighting while having only one motion capture suit

    • @Unrealhan
      @Unrealhan  A year ago +3

      Good point. Will put it on the list!

    • @FinalGrade
      @FinalGrade A year ago +1

      @@Unrealhan I second this! Would love to see your workflow on that!

  • @entertainmentbilly2231
    @entertainmentbilly2231 A year ago

    An update would be welcome. There is now a monthly payment option and an experimental mode, which works with all sorts of footage from any camera.

  • @TrashPraxis
    @TrashPraxis A year ago

    Nice video! I was thinking of doing this kind of video plus optical, but can't seem to get to it. Thanks for including the floor rolling and hands-on-a-surface moves; great to see how it does in these situations.

  • @cpsstudios
    @cpsstudios A year ago

    Needed that review, Thank you!!

  • @edcelmendoza9161
    @edcelmendoza9161 A year ago

    Thank you for sharing and explaining Han!

  • @yodojo3493
    @yodojo3493 A year ago

    Body Occlusion with 6 cameras! lol Great video!

  • @EnterMyDreams
    @EnterMyDreams A year ago

    I love your content! Great breakdown, thank you so much

  • @jackdouglass1667
    @jackdouglass1667 5 months ago

    Of course, it's raining on the day when you booked everything out.

  • @DaveK183
    @DaveK183 A year ago +1

    Hmm, I just checked all the options for this budget motion capture and I'm very disappointed (nothing against Move, though). My main question is: how are VR goggles so cheap, when they track the headset and several controllers with perfect precision? They can do it for less than 1,000 bucks, and you get a VR headset on top of it? I would really like some explanation... thanks.

  • @GarrethDean1
    @GarrethDean1 6 months ago

    +1 SUB! I have been looking for a great YouTuber who talks about retargeting in Maya. The most daunting thing as a solo indie dev is character rigging + animating and retargeting. I have not gotten it working properly. Anyway, great video! I was looking at Rokoko and an iPhone system. I think Rokoko is the way to go.

  • @emo10001
    @emo10001 A year ago

    This is super helpful. Thank you for an honest and comprehensive review with your OWN testing and data. Definitely a sub from me. I've got a Perception Neuron v32 which still works well for me. Markerless mocap at an affordable price with good fidelity...just doesn't quiiiite seem to be there ....yet

  • @gladiyit6771
    @gladiyit6771 A year ago

    ty

  • @yiiarts6641
    @yiiarts6641 A year ago

    Thanks for your efforts, mate. Can you also make a comparison video of Xsens and Rokoko? That would be great.

    • @Unrealhan
      @Unrealhan  A year ago +3

      Get Xsens if you can. Rokoko can't beat it, not even close.

  • @wadekettell2626
    @wadekettell2626 A year ago

    Good Review

  • @ahmeterkoc9502
    @ahmeterkoc9502 A year ago

    What about fingers? Is it good enough?

  • @prasoondhapola2875
    @prasoondhapola2875 A year ago

    I think if Wonder Studio turns out to be as good as they claim, these existing solutions won't stay relevant much longer, be it Move AI or mocap suits.

    • @Unrealhan
      @Unrealhan  A year ago +3

      I think Wonder Studio utilizes similar tech, only single-camera based. I remain skeptical about how polished it is until someone tests it.

  • @lovelearn341
    @lovelearn341 A year ago

    Cool video! I see you mostly do your animations in Unreal Engine, and I have a question. How many polygons does your rigged model have? As far as I can see, it's done very well. I personally rig models in Blender. Although they look great at LOD0, Blender can't move or render them smoothly, even though I have a powerful computer. Yet it seems to be very smooth in Unreal Engine. Can you give tips if I need to know something about a UE model beforehand?

  • @cj5787
    @cj5787 A year ago

    Uhm... so 365 USD for something you wouldn't use during the whole year, with the lie of the "$1 a day" price, and on top of that you only get 30 min per month...

  • @johndoe-cd9vt
    @johndoe-cd9vt A year ago

    Is it better to use Vive trackers?

    • @anthonyganjou2485
      @anthonyganjou2485 A year ago

      The data quality isn’t remotely comparable

    • @shataluproductions
      @shataluproductions A year ago

      @@anthonyganjou2485 in what way? Better or worse?

    • @dannylammy
      @dannylammy 7 months ago

      In my honest opinion after having tried it: nah.

  • @阿凯-v5l
    @阿凯-v5l A year ago

    Is that you? I even bought your course on 翼虎 (Yiihuu).

  • @mick3d293
    @mick3d293 A year ago +3

    Honestly, you failed to research proper clothing and contrast. You mean well, but it is a misleading video. You should try again with different clothing.

    • @riretf
      @riretf 2 hours ago

      No matter what clothing you wear, it's impossible to get better results than the MVN Link, live or recorded, because Move AI's software lags behind MVN Animate. In particular, the problem of recognizing fast repetitive movements as noise is very serious, and the floor-rolling problem is unrelated to the clothing.

  • @lotusfilm1
    @lotusfilm1 A year ago +1

    Move AI is only for those who are learning about animation and mocap.

    • @birdonfiremedia
      @birdonfiremedia A year ago +4

      There are a few AAA studios who would disagree with you there. It's still being refined and improved, but it outperformed my Awinda Starter in my tests (without a 10K p/a Xsens Pro subscription, I ran into drift issues with the Xsens that Move doesn't give me, so cleanup is at least consistent).

    • @FinalGrade
      @FinalGrade A year ago +1

      Disagree. Get creative with your mocap and telling a story. Be very specific about what you need and when. Once you do that, it has a professional place to be used. It's just another tool. Can't afford crazy Xsens prices? Well, then you use Move AI. That simple.

  • @UHHPIA
    @UHHPIA 2 months ago

    Jesus Christ loves you all