Learn more with my iOS developer courses (Swift, SwiftUI, UIKit) - seanallen.teachable.com
It’s funny that it’s called ARKit but then they tell devs to never use the word AR in their communication.
I know, right?
I’m going to need a vision series pleaseeeee. I’m developing on visionOS for my research this summer and you would make my lifeeee
I have a bunch of videos (and a full course) planned over the coming months.
@@seanallen AWESOME!!! You really are the king of iOS development YouTube. Nobody matches your effort and willingness to share knowledge. 👑
Very informative video, @Sean. I am struggling to find help implementing clean architecture in SwiftUI; can you please make a video on this?
Architecture is a tricky topic. It can be very subjective and it depends a lot on the particular type of project.
Great video Sean!! Can't wait for the one on RealityKit! Keep it up dude! :D
Thanks! Will do!
I work with people with disabilities who have limited movement. We were so excited by ARKit when it first dropped and we got to see the Hawkeye app in action; but I guess over time the usefulness of front-camera eye tracking hasn't really proven itself in the real world. Hawkeye and a few of the other ARKit-enabled solutions were always a bit difficult to use and subject to interference from lighting, etc. Sure, the Vision Pro has great eye-tracking sensors, but it's my understanding that third-party apps still can't access them. I can see two possible ways forward: Apple builds an iOS and macOS Accessibility setting that allows full system-wide eye-mouse control (possibly via the current AssistiveTouch menus, which is what the iOS MFi eye trackers use), or they build next-gen, outdoor-capable eye-tracking sensors (the kind found in a few assistive-tech eye trackers) into iPhones, iPads, and Macs. For me, the latter would be the preferred option, as it would enable eye tracking for everyone, democratising it and making it available to game devs, etc.
Sean, AR just makes more sense on headsets. I know AR was on mobile devices, but I have a feeling it didn't take off because users needed to hold the phone up for long stretches, which probably got tiring. Thank you for these videos. The ongoing question right now is, "So all I can do with this $3,500 device is watch videos?" I wonder who is going to make the first killer app for the AVP.
I tend to agree; we really need a whole new UI/UX language for AR/VR/MR. The current platforms only have limited accessibility (Apple says the Vision Pro has accessibility at its core, but that is still coming, as far as I know). Sarah Herrlinger is a very impressive person, though, so she may sway things in the direction they need to go.
I wonder why we can't use simple Plane Tracking (ARKit) in a bounded volume app, in the Shared Space. That would be so useful.
Hi, Sean. Are face tracking and body tracking available in visionOS now?
Just wanted to add, since I ended up here while developing for the Vision Pro: several ARKit features mentioned in this video are not possible on the Vision Pro currently. There is no face tracking or body tracking, because of privacy concerns. This may change in the future, but I think those watching the video should be aware that these features are not currently available.
Nor can you use eye tracking in third-party apps yet, I believe.
@@davidharraway8131 You can have things occur while the user hovers over an item, but the actual eye-tracking data isn't shared with developers. You can tell that an item has been hovered over, and in that case it usually means it is being looked at. It's kind of a weird thing, though, because the documentation is very clear that you don't get the actual eye-tracking data.
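To illustrate the point above: in SwiftUI on visionOS you can opt a view into a system hover effect, and the system renders the highlight when the user looks at the view, without ever handing the app the raw gaze coordinates. A minimal sketch (the view and button label are hypothetical; the `hoverEffect` modifier is real SwiftUI API):

```swift
import SwiftUI

struct GazeTargetView: View {
    var body: some View {
        Button("Select") {
            // The app only learns about the interaction when the user
            // actually commits to it (e.g., a pinch on visionOS).
            print("Item selected")
        }
        // The system draws the highlight out of process when the user's
        // gaze rests on this view; the app never receives gaze data.
        .hoverEffect(.highlight)
    }
}
```

This design is deliberate: the hover effect is applied by the system compositor, so apps get the affordance of "the user is probably looking here" without access to the privacy-sensitive eye-tracking stream.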
Imagine creating a nice bowling game for visionOS. That could be very interesting.
There are a lot of interesting possibilities...
Sean Allen makes everything simple.
My girlfriend is a VFX artist and makes lots of USDZ files. It's interesting how the things she's doing come together here. This is why I love iOS development ;)
The thought of seeing people wearing computers over their eyes everywhere I go makes me sick to my stomach.
It's OK. If you get a Vision Pro headset, you can use AR to replace those people's goggled-up eyes with a pair of cool shades, or whatever you like!
I think you are missing the big picture. The cell phone spent two decades, through the '80s and '90s, going from a big brick and a luxury tool for business, to a more accessible tool in the late '90s and early 2000s, to essential today. The same will happen now. I have a list of independent techs who would have paid (and still will pay) 3x the cost of the AVP out of their own money for stuff I was working on back in 2018 with the HoloLens 2. But Microsoft couldn't live up to the software side of it, and I was on Microsoft's AR research teams sharing independent data. There are MASSIVE opportunities on the business side for passthrough. Here is an early piece of work: ua-cam.com/video/Mw-GdoYupZ8/v-deo.html
@@enterprisar863 I took the comment to be less about the aesthetics, expense, and usability of the hardware, and more about the potential societal impact this technology will have. With mobile phones, the technical development path was the easy(ish) bit; the societal change brought about by social media apps and cameras everywhere was harder to predict.
Still, the cat's out of the bag now, so best get on board and see what it can do.
@@enterprisar863 I sincerely doubt you know what I'm talking about.
Hi, can you help me figure out how to manipulate the blend shapes of a USDZ file? I would appreciate any advice.
HYPEEEE
I'm very excited to see next video :)
I'm editing it as we speak :)
@@seanallen 🔥🔥🔥
Waiting for the upcoming lectures...
Coming soon.
Great content. Keep it coming for visionOS!
Thanks! And that's the plan...
Man I love your videos, really good!
Thanks! I appreciate that!
Nicely done
Glad you liked it!
fantastic!
Thanks!