Hi Ryan! Thank you for another useful video!
Thanks for this video. Very helpful. I was looking for such option.
Thanks for your video, Ryan! By the way, I wonder if there are any programmatic delegates for this feature. Is there a way to restart the recorded video, or hold it until I hit the start button?
Not that I know of. The video playback is something built into Xcode (runtime configuration, not code). I'll def experiment some more to see if looping or holding until play is possible.
Best,
Ryan
OMGosh, I’m such a huge fan of the AR Testing. I can imagine that it makes XCUITests feasible in AR!
Thanks Ryan
Hi, is there any way to use the recorded AR video in XCUITest?
Hi Dora, good question. I haven’t tried it. Currently I’m not aware of automated testing for AR but I’m sure many of us desperately need it. If I find anything on it, I’ll def circle back 👍🏼
Best,
Ryan
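To make the question above concrete: an XCUITest itself doesn't reference AR at all, and the recorded video is selected in the Xcode scheme (as noted in the reply), so if replay ever works under automated testing it would presumably look like an ordinary UI test running while the replay plays back. A minimal sketch, where "startButton" is a hypothetical accessibility identifier:

import XCTest

// Plain XCUITest sketch. The AR replay video is chosen in the Xcode scheme,
// not from test code; this test only drives the app's regular 2D UI.
final class ARAppUITests: XCTestCase {
    func testStartButtonTap() {
        let app = XCUIApplication()
        app.launch()

        // "startButton" is a hypothetical accessibility identifier.
        let startButton = app.buttons["startButton"]
        XCTAssertTrue(startButton.waitForExistence(timeout: 5))
        startButton.tap()
    }
}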
Is it possible to do this with, for example, an actor? Maybe their hourly rate is high, so you only have access to the actor once, and you'd like to replay that recorded session to retarget to another character's face instead of the one you used during the initial recording.
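As a rough idea of what such a recording would need to capture per frame: ARKit exposes the actor's expression as blend-shape coefficients on ARFaceAnchor, and those coefficients are independent of any particular character mesh, which is what makes retargeting possible. A hedged sketch of reading them in an ARSessionDelegate:

import ARKit

// Hedged sketch: the per-frame blend-shape values that could later drive a
// different character's face rig.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let faceAnchor as ARFaceAnchor in anchors {
        let blendShapes = faceAnchor.blendShapes   // named coefficients, 0.0 to 1.0
        if let jawOpen = blendShapes[.jawOpen]?.floatValue {
            print("jawOpen:", jawOpen)
        }
    }
}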
Thanks so much Ryan! I want to use this with "Object Tracking" and potentially record and shoot an AR session to place my animation. It's still kind of an idea.
I wonder whether Apple will improve object tracking with iOS 14 so we can actually anchor to moving objects. For example, if we have a bottle, I want to keep tracking that bottle and have spidef go around it.
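For context: ARKit's object detection today works against a previously scanned, static reference object, so continuously following a moving bottle isn't supported out of the box. A minimal sketch of the current API, assuming a scanned reference object lives in an asset-catalog group named "AR Resources" (an assumed name):

import ARKit

// Sketch of today's object detection (static objects only). "AR Resources"
// is an assumed resource group containing a scanned ARReferenceObject.
let configuration = ARWorldTrackingConfiguration()
if let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "AR Resources", bundle: nil) {
    configuration.detectionObjects = referenceObjects
}
// arView.session.run(configuration)   // run on an existing ARView's session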
Hi Ryan, great video. Question: is there a method to copy a scene (with its assigned behaviors and action sequences) from one Reality Composer file and paste it into a completely new Reality Composer file?
Hi Ryan -- I love your channel! I am currently learning Python and I am passionate about AR. I work in real estate and want to build augmented reality versions of floor plans. What would you recommend that I learn first? Should I learn Python or another language? What AR platform would be best to build these things (Unity, ARKit, etc.)? Thank you for your help and keep the awesome videos coming!
@@realityschool Thank you so much, Ryan! I prefer Apple as well. I have subscribed and have notifications on so I will be a routine watcher of your channel! As I get started I am sure I will have more questions and may even need some help -- when that time comes, you are certainly my first email. Again, thank you -- this is incredibly helpful. :)
Hello Ryan. Do you know if the recorded AR motion metadata can be exported as a 3D scene? I would love to have an FBX file to import that object-free motion-tracked camera into my favorite PC 3D app for adding complex rendered characters.
@@realityschool thank you so much
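ARKit has no built-in FBX export, but the raw camera motion is straightforward to capture yourself: an ARSessionDelegate receives the tracked camera transform every frame, and that data could then be converted to FBX in an external tool. A hedged sketch:

import ARKit

// Hedged sketch: collect the tracked camera transform (plus timestamp) each
// frame; conversion to FBX would happen outside ARKit.
final class CameraMotionRecorder: NSObject, ARSessionDelegate {
    private(set) var samples: [(time: TimeInterval, transform: simd_float4x4)] = []

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        samples.append((frame.timestamp, frame.camera.transform))
    }
}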
Can we test an AR Android app?
Hey Ryan! Really enjoying these videos a lot. Keep up the awesome work. I am working on developing an AR app with Xcode and Reality Composer, and I have the app running well, but I want to add 2D UI to my app before getting into the AR component. I'm new to Xcode and I think I need a main storyboard, but the AR app template in Xcode does not automatically create one. I have the initial launch screen, but need to add some 2D UI. I'd love your help with this.
@@realityschool I have a Figma prototype of the experience.
I'll send you a link to your email. Thanks!
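One option while waiting on a storyboard answer: 2D UIKit controls can be added entirely in code on top of the AR view, with no Main storyboard required. A minimal sketch, assuming a plain UIViewController that owns the RealityKit ARView (names are illustrative):

import UIKit
import RealityKit

// Minimal sketch: build the ARView and a 2D button programmatically.
class ARViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()

        arView.frame = view.bounds
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)

        let startButton = UIButton(type: .system)
        startButton.setTitle("Start", for: .normal)
        startButton.frame = CGRect(x: 20, y: 60, width: 120, height: 44)
        view.addSubview(startButton)   // layered above the AR view
    }
}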
Can we unplug our device after we upload the app to it? I would like to debug using the video without actually having the physical device next to me, because it often isn't available for me to use.
Good question. I haven’t tested this myself. I’m guessing that your iPhone or iPad has to be connected to a Mac for the debug with video to work. But again, I haven’t tried to unplug to see if it still worked.
Best,
Ryan
@@realityschool thank you! I'm working on an app to analyze human movement using the joint positions of the skeleton, and I'm very new to Swift / not a programmer. Your videos have helped a lot.
Happy to hear that the videos are helpful. Body tracking is one of the more complex AR features so definitely don’t hesitate to reach out if you have any questions. 👍🏼
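For the skeleton use case above, a rough sketch of reading joint positions, assuming an ARSession running ARBodyTrackingConfiguration (body tracking needs a real device with an A12 chip or newer):

import ARKit

// Rough sketch: read a named joint's world position from each ARBodyAnchor.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let bodyAnchor as ARBodyAnchor in anchors {
        let skeleton = bodyAnchor.skeleton
        // Joint transform relative to the body anchor; nil if not tracked.
        if let leftHand = skeleton.modelTransform(for: .leftHand) {
            let worldTransform = bodyAnchor.transform * leftHand
            print("Left hand world position:", worldTransform.columns.3)
        }
    }
}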
Do you know a way to record ARKit sessions in code (record a video of the session)? There is a library called SceneKitVideoRecorder that records AR sessions, but it uses the .snapshot method to feed the images that make up the video, and it's low quality. So if I want to record an AR session via code, how would I do that?
Any help is deeply appreciated.
@@realityschool Thanks for the info. I guess I'll just fiddle with the library code and hopefully improve it in some way.
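One possible alternative to the snapshot approach, sketched here with assumptions: every ARFrame carries the full-resolution camera image (capturedImage), which can be appended to an AVAssetWriter. This records only the camera feed, not the rendered virtual content, and error handling is omitted.

import ARKit
import AVFoundation

// Hedged sketch: write each ARFrame's capturedImage to an .mp4 file.
final class ARSessionVideoRecorder: NSObject, ARSessionDelegate {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var startTime: TimeInterval?

    init(outputURL: URL, width: Int, height: Int) throws {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input, sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)

        self.writer = writer
        self.input = input
        self.adaptor = adaptor
        super.init()
    }

    // Called every frame while this object is the ARSession's delegate.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if startTime == nil { startTime = frame.timestamp }
        guard input.isReadyForMoreMediaData, let start = startTime else { return }
        let time = CMTime(seconds: frame.timestamp - start, preferredTimescale: 600)
        if !adaptor.append(frame.capturedImage, withPresentationTime: time) {
            print("Dropped frame at", time.seconds)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}

Setting an instance of this as the session's delegate (for example arView.session.delegate) starts feeding frames in, and finish closes the file.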
Hey there
I have an idea for an app. What should I do?
@@realityschool I don't have any skills, I just have an idea. I tried reaching out to companies and the cost was too high for me.
@@realityschool Makes a lot of sense, thank you so much for your time.
Hey Ryan,
Great content! Sorry to bother you
Just wanted to check if you were able to take a look into the different image anchor issue?
Thanks,
Tareq
Ryan Kopinsky
Hey Ryan
I just emailed you a screenshot and more information on the issue.
Thank you so much!
Hey Ryan
I figured out the issue.
That last scene with the different image anchor contained a USDZ object I created online on Vectary.
The issue ended up being the object, not the image anchor.
Is there a specific process I have to go through in order to import a USDZ object?
Because all I did was click Import in RC and choose the object.
Thanks
Ryan Kopinsky Yeah, it works with Vectary because the 3D object shows up in RC. Color- and design-wise, it matched what I did on Vectary. The only issue is when I actually open the app after it's built. The error mentions something about not locating an asset or something like that.
Ryan Kopinsky I don’t get the last sentence about exporting it
Ryan Kopinsky Mac
I don't understand why it requires a real device to run :| I expected to be able to use the Simulator with a pre-recorded video.