Apple Motion Capture Using SwiftUI, ARKit + RealityKit

  • Published 28 Sep 2024

COMMENTS • 69

  • @TreyHope
    @TreyHope 7 months ago

    This was really good. I'm learning to use ARKit in Flutter right now, so a lot of these videos around Swift, especially ones like this, are very helpful.

  • @DanieleCeglia
    @DanieleCeglia 2 years ago +3

    Hello,
    first of all, thanks for this tutorial!
    I have followed all the steps and the app works.
    However, there are two things I have noticed that are not working well:
    1. if the person turns around, the skeleton does not rotate
    2. if the device is rotated before framing the person, the skeleton appears rotated differently from the person (it is rotated the way the device was rotated at the start of the app)
    The two problems are probably one and the same, but I cannot figure out how and where to apply the correct rotation to the skeleton (starting from the ARBodyAnchor, I suppose)...
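    A hedged sketch of one possible fix, assuming the tutorial keeps all joint/bone entities under a single root entity (skeletonRoot is a placeholder name): copy the ARBodyAnchor's rotation as well as its translation onto that root every frame, e.g. from session(_:didUpdate:).

    import ARKit
    import RealityKit

    // Sketch: apply the body anchor's full transform (rotation + translation)
    // to the skeleton's root entity so the skeleton turns with the person.
    func updateSkeletonPose(root skeletonRoot: Entity, from bodyAnchor: ARBodyAnchor) {
        let bodyTransform = Transform(matrix: bodyAnchor.transform)
        skeletonRoot.position = bodyTransform.translation
        skeletonRoot.orientation = bodyTransform.rotation
    }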

  • @robotman011
    @robotman011 2 years ago

    Welcome back! Looking forward to more Apple AR content!

  • @DanieleCeglia
    @DanieleCeglia 2 years ago +2

    Another thing is not clear to me.
    In the init method we instantiate jointEntity and boneEntity and add them as children of the parent entity.
    But these child entities (in particular jointEntity) have no translation, rotation or even scale set...
    So they should collapse on top of each other at the center of the anchor entity that contains the parent entity!
    Why doesn't this happen, and why do we see the whole skeleton appear correctly?
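    For reference, a minimal sketch of how the per-frame update typically works (placeholder names, not the tutorial's exact code): the joint entities may start at the origin, but every session update overwrites their positions from the tracked skeleton, so they only stay collapsed until the first body-anchor update arrives.

    import ARKit
    import RealityKit

    // Sketch: position each joint entity from the skeleton's joint model transforms,
    // which are expressed relative to the body anchor (the root joint).
    func updateJoints(from bodyAnchor: ARBodyAnchor, jointEntities: [Entity]) {
        let jointTransforms = bodyAnchor.skeleton.jointModelTransforms  // [simd_float4x4]
        for (index, jointTransform) in jointTransforms.enumerated() where index < jointEntities.count {
            jointEntities[index].position = Transform(matrix: jointTransform).translation
        }
    }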

  • @arttimeanytime2210
    @arttimeanytime2210 2 years ago +3

    You’re back 😍

  • @AA-iw9sq
    @AA-iw9sq 4 months ago

    Can I use the iPhone's front camera to do body tracking?

  • @CinematicAdventureOne
    @CinematicAdventureOne 2 years ago

    Thank you Ryan, another great video.

  • @SapnaSharma-oe3wg
    @SapnaSharma-oe3wg 2 years ago +1

    Hi Ryan, could you please list the supported devices for this particular code?
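    For reference: body tracking requires hardware with an A12 Bionic chip or newer, and rather than maintaining a device list you can check support at runtime. A minimal sketch:

    import ARKit
    import RealityKit

    // ARBodyTrackingConfiguration.isSupported is false on devices that lack
    // the required hardware (A12 Bionic or later).
    func startBodyTrackingIfSupported(on arView: ARView) {
        guard ARBodyTrackingConfiguration.isSupported else {
            print("Body tracking is not supported on this device.")
            return
        }
        arView.session.run(ARBodyTrackingConfiguration())
    }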

  • @NhanTrann
    @NhanTrann 1 year ago

    Thank you! Do you have any suggestions on how to also run other inference during or after body tracking (e.g., object detection using the Vision API)? I can imagine a pipeline that looks like: Camera -> ARKit (ARBodyTrackingConfiguration) -> Vision (or maybe CoreML). For example, if we can track that the human body is doing a certain pose, we can then trigger detection of the kind of shirt they're wearing.
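    One common approach, sketched below with illustrative names (the specific Vision request is just an example, not part of the tutorial), is to run a Vision request on each ARFrame's camera image from the same ARSessionDelegate that drives body tracking:

    import ARKit
    import Vision

    // Sketch: analyze ARKit's camera frames with Vision while body tracking runs.
    // A VNCoreMLRequest wrapping a custom model (e.g. a shirt classifier) would
    // slot in the same way as the body-pose request used here.
    final class FrameAnalyzer: NSObject, ARSessionDelegate {
        private let request = VNDetectHumanBodyPoseRequest()

        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                                orientation: .right,
                                                options: [:])
            do {
                try handler.perform([request])
                if let observation = request.results?.first {
                    // React to the detected pose here, e.g. trigger a heavier detection pass.
                    print("Body pose confidence: \(observation.confidence)")
                }
            } catch {
                print("Vision request failed: \(error)")
            }
        }
        // Note: in practice, heavy Vision work should run off the session delegate's
        // thread (and not on every frame) to avoid dropping camera frames.
    }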

  • @geomichelon
    @geomichelon 11 months ago

    Is it possible to generate a previous set of movements and compare them with my skeleton's movements? To be clear, compare two skeletons' positions?

  • @natgenesis5038
    @natgenesis5038 10 months ago

    How can I learn SwiftUI and ARKit?

  • @1Chitus
    @1Chitus 1 year ago +1

    Thank you

  • @unquestionabletv
    @unquestionabletv 1 year ago

    Are the tracking points better now? They seemed unreliable when accurate data was needed.

    • @realityschool
      @realityschool 1 year ago

      I haven’t been able to get very accurate data (data I would for example use in a medical app). I’ll see if any other experts in our community have been able to achieve higher accuracy.
      Best,
      Ryan

  • @MichaelNinoEvensen
    @MichaelNinoEvensen 1 year ago +3

    Such a great tutorial! You are extremely articulate and describe your steps super clearly. Thanks for putting this together!

    • @realityschool
      @realityschool 1 year ago

      Thanks Michael! So happy to hear you enjoyed the tutorial.
      Best,
      Ryan

  • @Classkit_tech
    @Classkit_tech 1 year ago

    Thank you 😊

  • @storiesuwu
    @storiesuwu 1 year ago

    Please can you help?
    I get an error in ARViewContainer on line 19, ARView is underlined red and says "fatal error". What should I do?

    • @storiesuwu
      @storiesuwu 1 year ago

      I have copied everything, but this is what it's saying.

    • @realityschool
      @realityschool 1 year ago

      Do you have a physical device (iPhone) plugged into your Mac and selected in the drop-down list?

    • @storiesuwu
      @storiesuwu 1 year ago

      @@realityschool Yes I do. With my other AR apps it works perfectly, but this one does not.

    • @storiesuwu
      @storiesuwu 1 year ago

      I just haven't figured it out yet.

    • @storiesuwu
      @storiesuwu 1 year ago

      @@realityschool I can't find a fix anywhere.

  • @jophuijbers2001
    @jophuijbers2001 2 years ago

    I followed the tutorial and the app builds just fine, but it freezes whenever I point my camera at a person. The console prints out: "ARSessionDelegate is retaining 15 ARFrames. This can lead to future camera frames being dropped."
    How can I fix this?
    Tested on iPhone XS, iOS 15.6
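    That warning typically means the session delegate (or code it calls) is keeping ARFrame objects alive, for example by doing heavy synchronous work in session(_:didUpdate:) or by capturing frames in asynchronous closures. A hedged sketch of the usual remedy, not a diagnosis of this specific project: copy only the plain values you need out of the frame and never pass the frame itself to background work.

    import ARKit
    import Foundation

    // Sketch: avoid retaining ARFrames in the delegate.
    final class LightweightDelegate: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            // Extract simple values synchronously...
            let timestamp = frame.timestamp
            let cameraTransform = frame.camera.transform
            // ...and capture only those values, never `frame`, in async work.
            DispatchQueue.global(qos: .userInitiated).async {
                _ = (timestamp, cameraTransform)  // placeholder for heavier processing
            }
        }
    }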

  • @dickspargel628
    @dickspargel628 1 year ago

    Is there any way to detect and track wrist, arm, and hand points only from a tight angle, like if I am holding my phone and looking at my arm?

  • @CinematicAdventureOne
    @CinematicAdventureOne 2 years ago +1

    Hi Ryan, I just tested it on my phone and it worked pretty well. Actually, my phone went to sleep during the test. It might be good to add a line of code (UIApplication.shared.isIdleTimerDisabled = true) to avoid sleep. What do you think? Thanks.

    • @realityschool
      @realityschool 2 years ago

      Hey Frank, that is a great suggestion. For most AR apps, it's good to prevent the device from dimming the display and/or going to sleep. Thank you for the comment!
      Best,
      Ryan

    • @CinematicAdventureOne
      @CinematicAdventureOne 2 years ago

      @@realityschool Thank you Ryan.

    • @artyom.mihailovich
      @artyom.mihailovich 2 years ago +1

      Also, to reduce memory use, since RealityKit is such a heavy framework, it can help to disable some features, such as:
      arView.renderOptions = [.disableMotionBlur,
                              .disableDepthOfField,
                              .disableHDR]
      I also noticed that you can access the ARSessionDelegate using defer {...}; it fires faster.
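      Both suggestions above fit naturally into the view setup. A minimal sketch (placing them inside a UIViewRepresentable is an assumption, not the tutorial's exact code):

      import SwiftUI
      import UIKit
      import RealityKit
      import ARKit

      // Sketch: keep the screen awake during a capture session and skip
      // post-processing features the demo doesn't need.
      struct BodyTrackingARViewContainer: UIViewRepresentable {
          func makeUIView(context: Context) -> ARView {
              let arView = ARView(frame: .zero)

              // Prevent the display from dimming or the device from sleeping mid-session.
              UIApplication.shared.isIdleTimerDisabled = true

              // Optional: disable render effects to save memory and GPU time.
              arView.renderOptions = [.disableMotionBlur, .disableDepthOfField, .disableHDR]

              arView.session.run(ARBodyTrackingConfiguration())
              return arView
          }

          func updateUIView(_ uiView: ARView, context: Context) {}
      }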

  • @deimovprojects
    @deimovprojects 3 months ago

    Hello, excellent tutorial... and how could I put a 3D t-shirt on the body so it follows it?

  • @Lyn3z
    @Lyn3z 2 years ago +1

    Hey :)
    Thanks for all your work!
    Is it possible to record and export the motion data to other programs, to later smooth the animations and use them for characters in e.g. Blender?

    • @realityschool
      @realityschool 2 years ago +5

      Good question. All the joint data is available in the transforms, so you could potentially store the data in memory and afterwards archive it to JSON. That being said, there isn't an easy way to just "export data". I'm assuming that whatever software receives the data would also want it in a very specific format. Maybe Blender or Unreal Engine have a (third-party) plugin or app to capture and export the data, but I haven't really done a deep dive on that. Hope this helps!
      Best,
      Ryan
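      As a rough illustration of "store in memory, then archive to JSON" (the sample structure below is an assumption, not a format Blender or any other tool expects):

      import ARKit
      import Foundation

      // Sketch: record joint transforms per frame as Codable samples, then encode to JSON.
      struct JointSample: Codable {
          let time: TimeInterval
          let joint: String
          let matrix: [Float]   // 4x4 model transform, flattened column by column
      }

      final class MotionRecorder {
          private(set) var samples: [JointSample] = []

          func record(_ bodyAnchor: ARBodyAnchor, at time: TimeInterval) {
              let skeleton = bodyAnchor.skeleton
              for (index, name) in skeleton.definition.jointNames.enumerated() {
                  let m = skeleton.jointModelTransforms[index]
                  let flat = [m.columns.0, m.columns.1, m.columns.2, m.columns.3]
                      .flatMap { [$0.x, $0.y, $0.z, $0.w] }
                  samples.append(JointSample(time: time, joint: name, matrix: flat))
              }
          }

          func exportJSON(to url: URL) throws {
              try JSONEncoder().encode(samples).write(to: url)
          }
      }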

  • @harrymack8640
    @harrymack8640 1 year ago +1

    Hello, I'm new to this sort of stuff. Great video, by the way. Could this be used for an augmented reality garment try-on in real time, if I have a large screen and camera set up?

    • @realityschool
      @realityschool 1 year ago +1

      Good question. I don't think it will work for virtual garment try-on because it detects the "bones" of the human skeleton but not the actual body. In order to assess fit for a garment, I imagine you'd want the mesh of the human body instead of just the joints and bones. Body tracking might work for you if you just want to overlay a digital fashion piece for fun, but an accurate fit assessment won't be possible with this method. Hope that helps!
      Best,
      Ryan

  • @Marc-AndréWeibezahn
    @Marc-AndréWeibezahn 1 year ago

    Can Motion Capture be used in macOS Catalyst apps? So far I have no luck with that.

  • @arkemal
    @arkemal 1 year ago

    Nice demo! Is it possible to take the body tracking data and apply it to a mesh in real time with RealityKit?
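    For reference, RealityKit can drive a rigged mesh directly from body tracking via BodyTrackedEntity. A minimal sketch (the asset name is a placeholder; the model must be a .usdz rigged to ARKit's skeleton, and the session must be running ARBodyTrackingConfiguration):

    import ARKit
    import RealityKit
    import Combine

    // Sketch: load a rigged character and parent it to a body anchor;
    // RealityKit then animates it from the tracked person.
    final class CharacterLoader {
        private var cancellable: AnyCancellable?
        private let bodyAnchor = AnchorEntity(.body)

        func load(into arView: ARView) {
            arView.scene.addAnchor(bodyAnchor)
            cancellable = Entity.loadBodyTrackedAsync(named: "character/robot")  // placeholder asset
                .sink(receiveCompletion: { completion in
                    if case .failure(let error) = completion {
                        print("Failed to load character: \(error)")
                    }
                }, receiveValue: { [weak self] character in
                    self?.bodyAnchor.addChild(character)
                })
        }
    }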

  • @CtrlHelp
    @CtrlHelp 6 months ago

    Is there any possibility to use the phone's front camera for this?

  • @이가은-c4g
    @이가은-c4g 7 months ago

    Thanks! It was a lot of fun! If possible, I'd like to see the GitHub repo!

  • @darrinegrove
    @darrinegrove 1 year ago +1

    Hi Ryan - thanks - super helpful! I got the code from Patreon, and my skeleton is mirrored, i.e., when I raise my right hand, the skeleton raises its left. Have you seen that? Anyone else? Any solution?

    • @realityschool
      @realityschool 1 year ago

      Hey Darrin! So after copy-pasting the code, when you raise your left hand, the skeleton's right hand goes up, and when you raise your right hand, its left goes up? How about the legs and feet? This is the first time I'm hearing of this bug, but we'll definitely get it sorted out! 🙏🏼
      Best,
      Ryan

    • @darrinegrove
      @darrinegrove 1 year ago

      @@realityschool Ryan, correct, and same with the feet. I've read that ARKit sometimes interprets the body as backward facing. Wondering if that has something to do with it. Please let me know if there's a better place to work this out. Patreon mentions a Discord channel for Pro members, but I didn't find a link.

    • @darrinegrove
      @darrinegrove 1 year ago

      @@realityschool For reference, I'm using an iPhone 14 Pro, iOS 16.0.3, Xcode 14.0.1

    • @realityschool
      @realityschool 1 year ago

      Got it! I have a 14 pro max so that should be great to see if it might be a device/config parameter or something else that is causing the bug. I will test the code as-is from Patreon and report back if I can reproduce and how to fix. If that doesn’t work, we’ll explore other options 👍🏼

    • @realityschool
      @realityschool 1 year ago

      I just ran the exact copy of Patreon code with Xcode 14.0.1, iOS 16.1 on iPhone 14 Pro Max and the skeleton isn't mirrored. Meaning I wasn't able to reproduce the bug. I also tried on an iPad Mini + Swift Playgrounds and skeleton is properly positioned. When creating the project, did you watch the video first and then copy paste the code or just straight copy paste? There may have been a step missed if just copy pasting.
      Just to make sure, can you check that your info.plist has a Privacy - Camera Usage Description? Also what happens if you run the body tracking code in a different environment (e.g. well-lit room)?

  • @alexnovikov1609
    @alexnovikov1609 10 months ago

    Wow! It works! Thank you!

  • @SapnaSharma-oe3wg
    @SapnaSharma-oe3wg 2 years ago

    Can we count the body actions with this?

  • @bahadrsonmez7522
    @bahadrsonmez7522 2 years ago

    Thank you, Ryan, for that fantastic video. It was so helpful and instructive.

    • @realityschool
      @realityschool 2 years ago

      Thank you, Bahadir! Happy to hear you enjoyed the video 🥳
      Best,
      Ryan

  • @pal.techXR
    @pal.techXR 1 year ago

    Hello Ryan,
    I followed your tutorial step by step. The project builds successfully, but the skeleton points still aren't visible. Can you please help?

    • @pal.techXR
      @pal.techXR 1 year ago

      I am using Xcode 14.3.1 and testing the project on an iPad Pro with the M1 chip.

    • @realityschool
      @realityschool 1 year ago

      Thanks for reaching out. Can you add a few print statements throughout the code (when the skeleton gets detected, when it's first added, and when it gets updated)? This will help us find where things are potentially not working.
      What sometimes also happens is that the skeleton is added to the scene but for some reason is placed somewhere else. Please use the iPad to look around and make sure it isn't visible anywhere in the space.
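      For what it's worth, the print-statement suggestion could look roughly like this (a sketch with a hypothetical delegate type, just to show where the logging hooks go):

      import ARKit
      import RealityKit

      // Sketch: log each stage of body tracking to see where detection stops.
      final class LoggingSessionDelegate: NSObject, ARSessionDelegate {
          func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
              for case let bodyAnchor as ARBodyAnchor in anchors {
                  print("Body anchor ADDED at \(Transform(matrix: bodyAnchor.transform).translation)")
              }
          }

          func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
              for case let bodyAnchor as ARBodyAnchor in anchors {
                  print("Body anchor UPDATED, isTracked: \(bodyAnchor.isTracked)")
              }
          }
      }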

    • @pal.techXR
      @pal.techXR 1 year ago

      @@realityschool Only a single bone is visible in the surroundings. It is not tracking the user.
      Here are the logs from executing the code:
      "2023-08-04 11:11:02.492970+0530 BodyTracking[3410:2257506] Metal GPU Frame Capture Enabled
      2023-08-04 11:11:02.493091+0530 BodyTracking[3410:2257506] Metal API Validation Enabled
      2023-08-04 11:11:02.581743+0530 BodyTracking[3410:2257506] [Foundation.IO] Could not locate file 'default-binaryarchive.metallib' in bundle.
      2023-08-04 11:11:02.792955+0530 BodyTracking[3410:2257506] [ECS.Core] Class for component AccessibilityComponent already registered
      2023-08-04 11:11:02.838259+0530 BodyTracking[3410:2257506] [AssetTypes] Registering library (/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
      2023-08-04 11:11:02.999712+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.028188+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arKitPassthrough.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.030685+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/drPostAndComposition.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.031450+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arSegmentationComposite.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.032150+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute0.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.033215+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute1.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.033631+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute2.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.042079+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute3.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.042527+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute4.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.042915+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute5.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.043361+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute6.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.043775+0530 BodyTracking[3410:2257506] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute7.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
      2023-08-04 11:11:03.044427+0530 BodyTracking[3410:2257506] [Foundation.Serialization] Json Parse Error line 18: Json Deserialization; unknown member 'EnableARProbes' - skipping.
      2023-08-04 11:11:03.044458+0530 BodyTracking[3410:2257506] [Foundation.Serialization] Json Parse Error line 20: Json Deserialization; unknown member 'EnableGuidedFilterOcclusion' - skipping.
      2023-08-04 11:11:03.167867+0530 BodyTracking[3410:2257506] Successfully load keyboard extensions"

  • @knzev
    @knzev 2 years ago

    Nice to see you again!

  • @floresjuarezdavidalejandro7683
    @floresjuarezdavidalejandro7683 2 years ago

    Great video, very well explained. Have you already reviewed the new features of ARKit 6?

    • @realityschool
      @realityschool 2 years ago

      Yes, I've been exploring the updates in ARKit 6. It's mostly improvements to quality and speed. I will likely do a livestream to share my thoughts.
      Best,
      Ryan

  • @genkidama7385
    @genkidama7385 1 year ago

    This is so ridiculous. They took Microsoft Kinect technology and can't even make results as good as what was done 10 years ago. Kinect v1 and v2 had better mocap than this garbage. Even the full-body skeleton is resized every frame and floats in mid-air.