1. A second full user account.
2. Saved window placement, with different layouts for different locations (office, family room, etc.).
3. Full controller support for games.
@@barharborme yes, great points, and ones I will add to my list for an update video down the line. Thank you for watching.
Good points ❤
Create your own environment: go to your favourite local park, beach, etc., take a scan with Vision Pro or a 360 camera (GoPro or Insta360) and voilà, your own personal environment.
We need the ability for the iPhone to hand off phone calls to the AVP.
@@petermunoz-bennett386 I would like that, that’s a great suggestion
Support for iPhone and iPad. It’s crazy that Apple has spent years selling the idea of an ecosystem that Vision Pro can’t and won’t connect into in any meaningful way. That said, I love my Vision Pro!
@@roki6467 I guess they worked so hard on the hardware and UI that they likely felt it was best to just get it out and build on it from there. Seems to be a theme with many companies now: ship the hardware knowing they can fix or upgrade it in software later down the line.
1) Faster boot times so it’s closer to instant-on. Also, work with financial institutions to certify retinal scans in lieu of having to enter a PIN at every power-on.
2) Connect to iPhone and iPad like Macs do. The physical display blanks, but you have a window to interact with. Ideally it would have the option of snapping to the physical device so you could interact with it as normal.
3) Save windows by location or by task, for example “general productivity”, “software development”, “balance my finances”.
I would like the ability to LiDAR-scan an object using the AVP or my iPhone, specifically my favourite coffee cup, so that the Vision Pro recognizes it and always punches it through the environment and I don’t need to pull myself out of immersion to find it and take a sip.
@@eunoia86 have you used Polycam much on your phone? I love using it. In truth, the photogrammetry mode is better than LiDAR, and if you could see Gaussian splats they would be amazing in Vision Pro.
@@AVisionexperiment Oh, interesting. I had assumed LiDAR was necessary for an accurate model. I guess I’m agnostic as to the technology used. But Apple, if you’re listening: whatever you need to do to give me my coffee cup back in immersion, please do that.
@@eunoia86 download Polycam on your phone to start scanning and use the photo option (100% better than LiDAR), then download the Polycam app for Apple Vision Pro to see some examples. Gaussians don’t seem to work for me, but other scans do. I have some under the username Ty_kix; check out my scans in their glory on AVP.
Also check out this video about 3D models. Polycam is the second app.
LIFE IN 3D | Top 3 Apple Vision Pro 3D Model Apps
ua-cam.com/video/uIgInJtKs88/v-deo.html
This item needs Home Sharing for those of us who have built audio/video collections going back to the iPods and Apple TV since 2007. I can’t play any of that media at present.
Apple should add cellular connectivity and allow carriers to subsidize the device. 🤯
Apple Pencil for Vision Pro!
Golden underrated idea there about the iPad guest app. Would change a lot!
@@josselincol I think with a lightweight eye calibration that guest mode would be really nice to have
I want a full keyboard like the Magic Keyboard, not the iPadOS keyboard.
So, I use a sit-stand desk. It would be brilliant if I could tell visionOS this, so that when I go from standing to sitting my apps don’t just remain floating in the sky!
Additionally, I’d like to be able to access Mac Virtual Display even if my Mac is locked. If they let us connect to our iPhones while they’re locked (on the Mac), then connecting to a locked Mac shouldn’t be that hard 😅
@@kokofrost6096 some very good ideas, and ones we really need to think about as positive additions. Thank you for watching.
I need folders so bad. I like to be organized, and I forget half the apps I have since they are pages back.
@@eyeDavid yes, this is a must as more apps launch onto the App Store.
There are so many things that could improve this product.
- ability to split the super-wide display into two discrete displays so they can be stacked.
- multiple-user mode - come on, Apple, this is an expensive product and a personal one. You market it as family friendly; stop making it so hard to share between family members.
- custom environments - could be shot with an iPhone and computationally stitched together. Like watch faces, this is probably never going to happen, as holding it back lets Apple seemingly release new features over time.
- continuity with iPhone and iPad - see the iPhone/iPad and mirror its display perfectly in space. Same with the Mac: start the virtual display by simply copying the display, with controls to pop it out to enlarged, wide, and super wide. With a display like an iMac or Studio Display, there could be an option to add a display. Even an Apple TV virtual share would be a great option.
- audio mixer to allow more than one source of audio, and make the audio spatial so that it comes from different parts of the space. The Quest 3 does this.
- share the Mac audio into the headset when using the Mac Virtual display
- allow the use of a mouse as well as a trackpad. Eye tracking flips the focus in 3D space and the mouse controls the pointer in that window. It works perfectly on the Quest, and I suspect it’s a cynical hardware constraint to get people to buy trackpads.
- room workspace profiles - allow launching profiles of windows that open perfectly aligned in a given space. So, an office profile has multiple app windows, while a living room may have a cinema setup.
- of course the big one should be Apple Intelligence features
- native ability to run a Spatial Persona through as a webcam to the Mac - there is an app that does this. Really, they need a Spatial Persona API.
- better awareness of context switching while editing text and looking at other windows, so it’s not so clunky.
- Pencil 3 pairing to allow drawing or writing on any surface, including a virtual whiteboard laid on a plain wall, a notepad, etc.
- ability to scan a paper document instantly and bring it into the spatial environment as a PDF - this should be pretty easy given the number of cameras.
@@NeilLavitt great list of possibilities, and I really do hope Apple listens. There is plenty of time, and now with visionOS 2 scheduled to launch on 16th Sept, we have loads of opportunities to get these points across. Thank you for taking the time to put this out there.
@@AVisionexperiment I hadn’t noticed this before, but in their visionOS release announcement: BT mouse support! Justified.
Great list
A virtual arcade controller for gaming, where you can customise the size and mapping.
@@BillyLeigh would be cool, but I’m not sure it would be very accurate. But who knows what Apple could pull off.
They probably will release the bigger virtual screen at the Mac event in October.
If your brain can handle it, also look into the TAP Strap 2 or TAP XR devices. One nice bit is you can use it to type on your leg while standing, and it’s a one-handed, gesture-based device.
@@cjadams7434 sounds interesting for sure. Thank you
good quality video
@@许浩川 thank you, I try my best to offer quality; I always want it to be very watchable.
I enjoy watching a person in “goggles” talk to me. Please wear your AVP more often in videos. Thanks.
Think of an Apple Pencil magnetic dock on the side of the headset…
@@cjadams7434 I’m sure Apple could find an elegant design for that. Would be pretty cool
Apple has you ..
@@SWGOHWARS-66 has me what? 😂🤷♂️
QR code recognition via pupil tracking will be huge
@@LeftBoot that would be good, but I can’t see Apple opening up the cameras yet because of people abusing it in some way.
@@AVisionexperiment not with AI
@@LeftBoot it’s more about Apple’s stance on privacy. AI won’t fill people with confidence on privacy just yet.
@@AVisionexperiment what do you mean by 'open up the cameras'? Augmented devs don't get access to the cameras?
@@LeftBoot what’s going to scan the QR code if it’s not the cameras on the device?
visionOS 2 is coming
@@無敵な飛王可愛い it’s here! I love it
More immersive material and 4K per eye, as they promised.
@@pep98 hopefully we will see that as we move into 2025🤞🤞🤞
I do not enjoy watching a person in goggles talk to me. If you are going to address the camera directly, remove the device. If you are demonstrating a specific feature that requires you to wear the AVP, then put it on. Sorry to be critical, but I thought you should know.
You make some good points here. I'm curious to see how this product grows over time. I'm currently deciding if I want to wear my AVP for the keynote.
@@stevedorsey1269 if you watched my last video, you’ll understand that my script is in the headset in Notes, so that’s one of the main reasons for keeping the headset on. I would like a hack that allows my Persona eyes to show all the time so it looks like I’m engaging with the camera.
I will do content without it on, but more content that’s not set out or scripted.
Thanks for watching and commenting.
Isn’t it great that you have a choice? It’s a channel about the Vision Pro, so for me it doesn’t matter.
@@NeilLavitt 🫶
I think that’s why many use video overlays while you talk in the background. You do a bit of this; maybe do more?
@@NeilLavitt if there are overlays to use I will definitely use them, some things that don’t exist are hard to create. Maybe I need to look at some quality AI image generators to show my “visions” no pun intended