Speed Up AR Testing by Recording Session Data // Reality Composer + RealityKit + ARKit

  • Published 17 Nov 2024

COMMENTS • 35

  • @alexnovikov1609 • 7 months ago

    Hi Ryan! Thank you for another useful video!

  • @joshiakhilesh • 2 years ago +1

    Thanks for this video. Very helpful. I was looking for such an option.

  • @박관홍-m7p • 3 years ago +2

    Thanks for your video, Ryan! By the way, I wonder if there are any programmatic delegates for this feature... Is there a way to restart the recorded video, or hold it until I hit a start button?

    • @realityschool • 3 years ago +2

      Not that I know of. The video playback is something built into Xcode (runtime configuration, not code). I'll def experiment some more to see if looping or holding until play is possible.
      Best,
      Ryan

  • @PilgrimMatt42 • 4 years ago

    OMGosh, I’m such a huge fan of the AR Testing. I can imagine that it makes XCUITests feasible in AR!

  • @StarkRaveness • 3 years ago

    Thanks Ryan

  • @doraliang3452 • 3 years ago +1

    Hi, is there any way to use AR recording video in XCUITest?

    • @realityschool • 3 years ago +1

      Hi Dora, good question. I haven’t tried it. Currently I’m not aware of automated testing for AR but I’m sure many of us desperately need it. If I find anything on it, I’ll def circle back 👍🏼
      Best,
      Ryan
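
      The XCUITest question above is worth sketching. To my knowledge there is no dedicated AR-replay API for tests, but since the replay file is configured in the scheme's Run options (Xcode's "ARKit > Replay data" setting), a test scheme pointed at a recorded session should, in principle, feed the app a deterministic camera stream that an ordinary UI test can assert against. A minimal, untested sketch — "Place Object" is a hypothetical accessibility identifier, not something from the video:

      ```swift
      import XCTest

      final class ARReplayUITests: XCTestCase {
          func testPlacementButtonAppearsDuringReplay() {
              // Assumes the test scheme's Run options set "ARKit > Replay data"
              // to a session recording, so the app sees a deterministic feed.
              let app = XCUIApplication()
              app.launch()

              // "Place Object" is a placeholder; use an identifier from your own app.
              let placeButton = app.buttons["Place Object"]
              XCTAssertTrue(placeButton.waitForExistence(timeout: 10))
          }
      }
      ```

      Note this still requires a physical device: AR session replay does not run in the Simulator.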

  • @Iron-David • 2 years ago

    Is it possible to do this with, for example, an actor? Maybe their hourly rate is high, so you only have access to the actor once, and you'd like to replay that recorded session to retarget another character's face instead of the one you used during the initial recording.

  • @JimmyGunawanX • 4 years ago

    Thanks so much Ryan! I want to use this with "Object Tracking" and potentially record and shoot an AR session to place my animation. Still kind of an idea.
    I wonder whether Apple improves object tracking in iOS 14 so we can actually anchor to moving objects. For example, if we have a bottle, I want to keep tracking that bottle and make Spidey go around it.

  • @BUdJohnson242 • 3 years ago

    Hi Ryan, great video. Question: is there a way to copy a scene (with its assigned behaviors and action sequences) from one Reality Composer file and paste it into a completely new Reality Composer file?

  • @sydneyphillips6531 • 4 years ago

    Hi Ryan -- I love your channel! I am currently learning Python and I am passionate about AR. I work in real estate and want to build augmented reality versions of floor plans. What would you recommend that I learn first? Should I learn Python or another language? What AR platform would be best to build these things (Unity, ARKit, etc.)? Thank you for your help and keep the awesome videos coming!!

    • @sydneyphillips6531 • 4 years ago

      @@realityschool Thank you so much, Ryan! I prefer Apple as well. I have subscribed and have notifications on so I will be a routine watcher of your channel! As I get started I am sure I will have more questions and may even need some help -- when that time comes, you are certainly my first email. Again, thank you -- this is incredibly helpful. :)

  • @menamariano • 4 years ago

    Hello Ryan. Do you know if the recorded AR motion metadata can be exported as a 3D scene? I would love to have an FBX file to import that object-free, motion-tracked camera into my favorite PC 3D app for adding complex rendered characters.

    • @menamariano • 4 years ago

      @@realityschool thank you so much

  • @aniketkadukar7772 • 1 year ago

    Can we test an AR Android app?

  • @chadtetzlaff9556 • 4 years ago

    Hey Ryan! Really enjoying these videos a lot. Keep up the awesome work. I am working on developing an AR app with Xcode and Reality Composer and I have the app running well, but I want to add 2D UI to my app before getting into the AR component. I'm new to Xcode and I think I need a main storyboard, but the AR app template in Xcode does not automatically create one. I have the initial launch screen, but need to add some 2D UI. I'd love your help with this.

    • @chadtetzlaff9556 • 4 years ago

      @@realityschool I have a figma prototype of the experience.

    • @chadtetzlaff9556 • 4 years ago

      I'll send you a link to your email. Thanks!
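
      On the 2D-UI question above: one common approach, assuming the SwiftUI-based AR app template (which indeed has no storyboard), is to layer ordinary SwiftUI controls over the AR view in a ZStack. A minimal sketch — the button label and action are placeholders, not from the video:

      ```swift
      import SwiftUI
      import RealityKit

      // Hosts RealityKit's ARView inside SwiftUI.
      struct ARViewContainer: UIViewRepresentable {
          func makeUIView(context: Context) -> ARView { ARView(frame: .zero) }
          func updateUIView(_ uiView: ARView, context: Context) {}
      }

      struct ContentView: View {
          var body: some View {
              ZStack(alignment: .bottom) {
                  ARViewContainer().edgesIgnoringSafeArea(.all)
                  // Any 2D SwiftUI control can sit on top of the AR view.
                  Button("Reset Scene") {
                      // Placeholder action; wire up to your own app state.
                  }
                  .padding()
              }
          }
      }
      ```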

  • @drdil6056 • 3 years ago

    Can we unplug our device after we upload the app? I would like to debug using the video without actually having the physical device next to me, because it often isn't available for me to use.

    • @realityschool • 3 years ago

      Good question. I haven’t tested this myself. I’m guessing that your iPhone or iPad has to be connected to a Mac for the debug with video to work. But again, I haven’t tried to unplug to see if it still worked.
      Best,
      Ryan

    • @drdil6056 • 3 years ago +1

      @@realityschool thank you! I'm working on an app to analyze human movement using the joint positions of the skeleton, and am very new to Swift / not a programmer. Your videos have helped a lot.

    • @realityschool • 3 years ago

      Happy to hear that the videos are helpful. Body tracking is one of the more complex AR features so definitely don’t hesitate to reach out if you have any questions. 👍🏼

  • @RatherBeCancelledThanHandled • 4 years ago

    Do you know a way to record ARKit sessions in code (record a video of the session)? There is a library called SceneKitVideoRecorder that records AR sessions, but it uses the .snapshot method to feed the images that create the video, and it's low quality. So if I want to record an AR session via code, how would I do that?
    Any help is deeply appreciated.

    • @RatherBeCancelledThanHandled • 4 years ago

      @@realityschool Thanks for the info I guess I'll just fiddle with the library code and hopefully improve it in some way.
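
      For the recording question above, one higher-quality alternative to snapshot-based capture is Apple's ReplayKit, which records the screen (including the AR view) via RPScreenRecorder. A minimal sketch, untested here:

      ```swift
      import ReplayKit
      import UIKit

      final class ARSessionVideoRecorder {
          private let recorder = RPScreenRecorder.shared()

          func start() {
              guard recorder.isAvailable else { return }
              recorder.startRecording { error in
                  if let error = error { print("Failed to start recording: \(error)") }
              }
          }

          func stop(presentingFrom viewController: UIViewController) {
              recorder.stopRecording { preview, error in
                  // RPPreviewViewController lets the user trim and save the clip.
                  if let preview = preview {
                      viewController.present(preview, animated: true)
                  }
              }
          }
      }
      ```

      The trade-off: ReplayKit captures exactly what is on screen (UI overlays included) rather than the raw camera frames.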

  • @sandra96160 • 4 years ago

    Hey there,
    I have an idea for an app. What should I do?

    • @sandra96160 • 4 years ago

      @@realityschool I don't have any skills, I just have an idea. I tried to reach companies, and the cost was too high for me.

    • @sandra96160 • 4 years ago

      @@realityschool makes a lot of sense thank you so much for your time

  • @tareqlabeeb2000 • 4 years ago

    Hey Ryan,
    Great content! Sorry to bother you
    Just wanted to check if you were able to take a look into the different image anchor issue?
    Thanks,
    Tareq

    • @tareqlabeeb2000 • 4 years ago

      Ryan Kopinsky
      Hey Ryan
      I just emailed you a screenshot and more information on the issue
      Thank you so much!

    • @tareqlabeeb2000 • 4 years ago

      Hey Ryan,
      I figured out the issue. That last scene with the different image anchor contained a USDZ object I created online in Vectary. The issue ended up being the object, not the image anchor.
      Is there a specific process I have to go through in order to import a USDZ object? Because all I did was click Import in RC and choose the object.
      Thanks

    • @tareqlabeeb2000 • 4 years ago

      Ryan Kopinsky yeah, it works with Vectary because the 3D object shows up in RC. Color- and design-wise, it matched what I did in Vectary. The only issue is when I actually open the app after it's built. The error mentions something about not locating an asset, or something like that.

    • @tareqlabeeb2000 • 4 years ago

      Ryan Kopinsky I don’t get the last sentence about exporting it

    • @tareqlabeeb2000 • 4 years ago

      Ryan Kopinsky Mac

  • @ДмитрийАнтоненко-н3ы

    I don't understand why it requires a real device to run :| I expected to be able to use the Simulator with a pre-recorded video.