February 2 is Groundhog Day, which also means one of my favorite movies gets replayed on TV. This year though, I almost forgot about that because I was geeking out over the release of Apple's Vision Pro headset. Since then, I wondered how long I could wait before caving and at least checking it out, even though I have no intention of buying one. A couple of days later Apple began letting people reserve demos in their retail stores, so I decided to book one when I found out that my local store had a surprising number of open spots available. I did the same thing when the Apple Watch came out in 2015.
I got there before the store opened and had plenty of time to check out the AVP headsets on display before my appointment time.
Physical description
The AVP is actually smaller than I thought it would be. It really does look a lot like a heavy pair of ski or scuba goggles crossed with a pair of AirPods Max. The front of the headset is all curved glass, which surprised me; there are no camera bumps anywhere, so you can't feel where any of the camera lenses are. On the top right side is the same Digital Crown from the AirPods Max, and on the top left is a long button, like the transparency button on the AirPods Max, except here it takes pictures and video; I think I had to triple-click it to turn on VoiceOver.
The padded cushion that touches your face feels kind of like leather, though I don't think it is. On the inside you can feel the lenses that sit in front of your eyes, kind of like the ones in glasses. The cushion and the light seal attach to the headset magnetically the same way the ear cushions do on the AirPods Max. In fact, that's the way I would describe what the AVP feels like: AirPods Max for your eyes. The external battery is about the size of an iPhone Pro Max and a little bit thicker. It's just a big aluminum rectangle with an Apple logo in the middle, a USB-C port on the bottom, and the braided cable permanently attached.
Initial setup
Once my appointment came up, they took me to a table and used an iPhone to scan my face for fitting. It was a lot like setting up Face ID, but I actually liked it better because you don't move your face in a circle. The guy had me look up and down and side to side several times, which was probably to ensure the best possible fit. I think he told me I used the smallest light seals and cushions they have.
A few minutes later, the guy brought out the Vision Pro on a wooden tray. He explained everything about it, including the proper way to pick it up and put it on. If you pick it up wrong the light seal will easily detach and your expensive headset will go crashing down. They tell you to pick it up with a thumb under where the headset goes over the bridge of your nose.
They only let me try the knit headband. It's kind of a stretchy fabric with a dial on it to tighten it. I had to make it as tight as it would go. The headband kept wanting to go behind my ears like a COVID mask, which I quickly hated. I figured out that I could barely fit it over the tops of my ears though, which was at least a little more tolerable. I agree with a lot of the mainstream sentiment though; the headset is very front-heavy and I did have to take it off a couple of times during my demo.
Setting up VoiceOver was actually a lot more frustrating than I thought. When you first turn on accessibility settings, it has you turn the digital crown to move through the options and press to select the one you want. This was fine, except when I tried to get into VO settings to play with the gestures, it kept kicking me back out to the main accessibility setup screen. When I actively tried to look straight ahead so the specialist could see what I was looking at on his iPad, the same setup screen would pop up again.
Getting around the Vision Pro
I had read up on the VO gestures beforehand so I wouldn't have to spend as much of my limited time trying to figure out how to use the screen reader. Some of the default gestures are kind of backwards to me. I found myself doing the equivalent of flicking in the wrong direction because I was pinching the wrong finger with my thumb. The gestures can be customized though.
When the VO gestures worked correctly, it was awesome! It felt totally natural zipping around apps. If I could have an Apple Watch control my iPhone like that, I would get one. The sound is also really good, which I didn't expect coming from the little audio pods on the sides of the headset. VO sounds are even spatial, so as you move between apps on the home screen, for instance, the VO sounds pan from left to right when you get to apps on that side of your view. I thought this was a neat touch.
But what I thought would happen is exactly what ultimately did. The AVP kept interpreting the slightest hand movements, and probably eye movements too, as input gestures, resulting in a lot of pop-up dialogs, and even switches to different apps, that left me wondering where they came from. Even when I turned off eye tracking in VO settings, I kept getting a lot of spurious input.
Minimal app experiences
Normally the AVP demo is pretty guided, and they have people go through a series of apps to see what they think. They don't just give you a free-for-all during your time. Since I was using VO though, they let me try apps that I thought would be useful to me. I played with the Music app, and was actually impressed with the sound the AVP puts out. To me, it sounded like a good pair of headphones, though to be fair I didn't spend a lot of time listening to anything because I didn't think to look up anything I was familiar with.
Then they had me check out the immersive video demo in the TV app. Although it does have AD (audio description), it did nothing for me. It would've been exactly the same as watching a demo video from one of Apple's keynotes on my phone with a good set of headphones. That's kind of to be expected though, since the immersive stuff is totally visual.
But I must have had to restart that video about ten times, because I either couldn't get the language menu to come up so I could turn on the AD track, or if I did, when I tried to go back to the video player controls to start playing it again, I would get moved somewhere else and have to select the video all over again. This got frustrating really fast, since I couldn't do much more than the equivalent of flicking through each element on the virtual screen.
That's what made me take the AVP off in frustration, even though I got to play with it for an hour. Most of my time went to figuring out how to navigate with VO. Every time I thought I was getting somewhere, a random dialog would come up, or it would think I was trying to open Control Center, or I'd end up on the status bar, or who knows where else. It was pretty disorienting. VO also uses the image descriptions feature to try to describe what the camera sees in passthrough mode. It kept telling me "four people in front of you" when I would try to look straight ahead. This was cool and a good consideration, but it got annoying quickly, much like when VO constantly tells you when focus changes in the Camera app on your iPhone. I'm sure this can be turned off though.
Final thoughts
I already knew I wouldn't buy this yet even if I could afford one. I'm not sure how much value a blind person would get out of this right now. Just like I thought, using the Vision Pro with VoiceOver takes away all tactility. There is an equivalent to exploring your iPhone screen by touch, where you can tap and hold a finger and your thumb together, and as long as you hold them there, VO will speak what you are looking at. I found this way too sensitive though. Perhaps it would get easier with time. At least on your iPhone you're dragging your finger around a flat sheet of glass, so you have at least some tactile idea of where things are in relation to each other. With the AVP, that's all just in virtual space around you.
Comments
This doesn't surprise me.
Honestly, I'll probably wait for something else to come out and buy that.
I like the idea of the AVP, but the VoiceOver gestures seem clunky, although they've probably worked with what they have, and at the moment I just can't see the appeal.
Also, wow, that's some very delicate tech. I'd not be surprised if people start complaining about how easy it is to break the AVP.
I mentioned in another post that I'd probably buy the next gen or the one after that, but honestly, now that I've read a bit more, nah. I'd not mind playing with it, but I can't see much here for a VoiceOver user; this is a very visual experience.
The only thing I can think of at the moment is that VoiceOver might get better at guiding you around places, and if that's true I'll look into it, but for now I must say I'll be giving it a miss.
this sounds interesting to…
This sounds interesting to read for curiosity purposes. However, yep, not much for us.
And of course they'll be more costly here in my country, even for a lot of nondisabled people.
it's called "vision pro"
I mean, it's always good to be curious and optimistic, but really, how much can a blind person expect from a device called "Vision Pro"? It would do magic for one's vision, something a blind person doesn't have.
@LaBoheme.
I completely agree. It will be interesting to see where this goes in the future but I don't think it will really be for us to be honest.
I might change my mind when AppleVis adds an Apple Vision Pro headset link :)
Vision Pro
To make it work: 1) battery life must be longer; 2) accessible apps such as Be My AI must be able to use the camera; 3) the price must be better.
May become useful after more software and hardware revisions
I could see VoiceOver gestures being tweaked as new software patches are released. If memory serves, dragging and dropping apps on iOS didn't work for us at the initial introduction of VoiceOver, so this could be more of the same. Usability enhancements will trickle in over time. Having said this, I could see how owning a pair of these might eventually be worthwhile. Just imagine an app like OKO indicating traffic lights and using spatial audio to indicate whether you were veering to the left or right during the crossing. Imagine VoiceOver's real-time image description merging with Siri so you could ask questions and get detailed answers live. Not saying any of this will happen with the initial hardware, but I predict a fair number of us will have these in a few years.
Possibly.
I could see it happening. Let's enjoy the journey together and see where it goes.
The first Vision Pro is what I expected for a first gen
I didn't expect the Vision Pro to really have enough accessibility built in. I would never trust a headset, however technologically advanced, to guide me across a street; that's what mobility training is for. I'm a visual person, so if the price came down and accessibility improved, yeah, I might consider it. I mean, how would it be to watch one of my favorite bands on stage as it describes the lead singer's outfits, her bandmates' costumes, whether she's laughing and making cute gestures to the gentleman she's hanging with at the moment? For now, the headset has so much more advancing to do. Here's hoping the patches improve accessibility for not just us; I'd also love to see how someone in a wheelchair might use it, and what challenges they face. Here's to the next update soon.
Not very surprising
Considering the way the description came across to me, most of your gestures would be controlled through eye movements, so most blind people would probably have an extremely hard time even getting this particular device to be usable. The only thing that I could see being useful about this would be using it with Aira! That would at least give us as blind people a wearable camera, at least somewhat within a reasonable price range, that we could use more effectively with that service.
Problems with this post
So some of the things you talked about could be useful: a description of what the product feels like in the hand, what it's like to put it on, how it fits on your face and head, the experience with the Apple representatives, what they tried to show you. But it sounds more like you went there and they weren't really prepared for you whatsoever.
At this juncture, are there even apps for the Vision Pro, at least ones that are specifically tailored to the visually impaired? It's only been like five days. The other issue that I have is: do you have any mobility issues? Do you have a hard time using your hands in any way? How much sight do you have? Your use case is going to be very specific to who you are, and it might be very different from mine. You spent what, 60 minutes with the device? Is that actually enough time to get a handle on how the device is actually used? And you used it in an environment that is usually very crowded.
You did the best you could with what you had. But this isn’t really about what a blind person is going to experience at an Apple Store. This is more about how a blind person might adapt to this new technology. And that’s going to require some practice. And you’re not gonna get practice unless you’re using it at home.
Ultimately, the conclusion is true. This particular device isn't necessarily focused on the blind community, at least not yet. But how you got there is problematic. I think this post needs a caveat: more of a description of who you are and how you use your devices, helping people understand that your experience was specifically at the Apple Store and doesn't necessarily represent what it's going to be like to actually use the device in real-world situations.
The experience
The demo experience sounds interesting; however, I am not inclined to purchase a Vision Pro. There is potential in the future, but for now it is for the sighted world. I would also become easily frustrated with the device interpreting my eye movements as gestures. Also, the concept of operating a virtual screen with VoiceOver is a little over my head.
Dude!
"It's perplexing to me, as a totally blind person, that why individuals who are blind expect vision technology to be accessible to them, especially when the device in question is explicitly designed for those with sight. The product is named 'VISION Pro,' clearly indicating it is intended for people who are sighted or have partial vision. It's quite surprising to see people visiting an Apple Store and engaging with a device that is, by its very name, targeted at those with the ability to see. I'm not advocating for discouragement, but let's be realistic."
roman
That is not a useful comment, because Apple has repeatedly spoken out about making all their products accessible to everyone. Similarly, they use the blind in their marketing on a regular basis. So everyone's expectation that this will be accessible is totally justified. Whether or not you understand it isn't really relevant to the fact that people are excited about a new technology and don't want to be left behind.
@roman
Why do you use a touchscreen smartphone then? Isn't a phone made entirely of a glass surface, with its touch display as the sole means of input, also primarily made for sighted people?
Also, as a side note: the OP's idea of using an Apple Watch to capture hand gestures to control the iPhone is amazing. Someone at Apple, please take note.
Nope
Apple completely lost all of my interest with the part that you have to pick it up in a certain and delicate way. Don't want anything to do with it, and the only thing I might be interested in, relating to this type of wearable product, is if Apple makes Facetime on iOS able to access an external camera so as to use an off-the-shelf pair of camera glasses; preferably not an overpriced, Apple brand product.
I agree with OldBear.
It's so delicate. Hopefully that changes in the future, but honestly, I doubt it.
Apple loves their thin designs, so there's that.
I also agree with @LaBoheme. If this device becomes useful to us in the future, great! But if blind people bought it now, it'd just be a fancy pair of glasses with VoiceOver built in, nothing special really.
Also, for that price, if an app were developed to help us get around, it had better be amazing and beat all the blindness apps out there.
80 percent vision, 20 percent VoiceOver
Yeah, that's what it sounds like. Is it worth $3500 US? Not to me. You're paying for the advanced images, not anything else IMHO.
if something has a screenreader, then it means that they try
If something has a screen reader, then it means that they're trying to make it accessible.
They wouldn't bother otherwise.
SeasonKing
"Oh, what's this? Have we stumbled upon a groundbreaking revelation, or are you merely showcasing an unparalleled level of ignorance? Let's not forget that a smart-phone, shockingly, is still a tactile object. Yes, we can actually touch its screen, develop what's known as muscle memory, and, lo and behold, VoiceOver can even read items aloud for us to navigate. Revolutionary, isn't it?
As for the Vision Pro, oh, it's a marvel of modern technology, completely virtual, requiring the miraculous act of seeing with one's eyes — a concept clearly alien to our community. Why do you think we speak of 'totally blind individuals'? Not for the fun of it, I assure you."
Ash Rein
"I understand the feeling of being left behind, yet it's crucial that we manage our expectations, especially regarding products designed primarily for those who are sighted or nearly sighted. While I acknowledge that Apple markets their products with us in mind, we shouldn't hold high expectations for this particular offering, as it's not intended for our use. If you decide to purchase it, by all means, go ahead; however, I will maintain my stance regardless of whether or not you agree with me."
Sight is not everything?
It is everything. All mainstream tech products are made for sighted people. They add accessibility when they remember to, or when someone forces them. The Vision Pro, for now, is not for blind people. Three years from now? The Shadow knows.
Could be good with some tweaks
Hello,
The fact that Brian could use the image descriptions feature with VoiceOver proves this device could be usable with some major tweaks. You could use it with Be My AI or the built-in cameras to describe your surroundings, for example. I wouldn't buy it in its current state though. Too expensive, and the device is far too sensitive to get anything productive done with VoiceOver. I think trying out the device for an hour is absolutely enough time to get the gist of it. I tried out the iPhone for the first time in 2013, and after about five minutes I knew it was for me. I wouldn't rule out getting this device completely if the price comes down and there is a decrease in size and weight. Imagine walking down the street and having stuff just described, or going to an event and not having to constantly ask someone what's going on. There are apps like VoiceVista, and door detection if you've got lidar, but you still have to take your phone out of your pocket and point it at objects. With the Vision Pro you wouldn't have to do this. You could just keep walking.
Tara
Nice but the Cubs have a better chance of winning the world series this year or next.
@roman
The Apple Vision Pro might provide you feedback in auditory cues when you make those gestures. Maybe in the future, it could also provide haptic feedback on my temple, or around the entire circumference of my head, giving us an extra input. What we might use it for is anyone's guess.
I listen to books and documentaries on my phone while having food, and find it very inconvenient to touch the display or my earphones' touch surface. With a device like the Vision Pro, maybe that problem can go away.
I for one am not going to judge a product category by its version 1. I'm going to give it a fair chance to evolve, and keep an eye on the things happening in this space.
I've never liked the glass display; it's fragile, clunky, and downright inconvenient when it comes to performing complicated four- and three-finger gestures. And don't even get me started on that two-finger Z gesture.
Stick to your glass slab as long as you want. I don't care.
More thoughts
Some more things I've thought of since my demo of the AVP the other day.
Yes, you would get a much better idea of whether a product works for you once you get your own and start integrating it into your workflow. I tried several times to use a Mac with VO on campus in college back in the Leopard days. Once I got my own MacBook in mid-2010, things started to make more sense. When I first played with an iPhone back when the 3GS was new, the interface seemed a bit complicated, though to be fair I only played with it for about five minutes. When I got an iPod touch at the end of 2009, it took me about half a day before navigating the touch screen clicked.
But how many people are going to drop a minimum of $3500 to play with a device that is primarily focused on visual interface enhancements right now? I already knew the AVP wasn't something I would buy, but I thought it would be interesting to see how they implemented VoiceOver.
I am a total and I don't have prosthetic eyes. I also don't have any fine motor issues. I looked at the AVP user guide again later, and there's a setting in accessibility for eye input, but it doesn't sound like you can just turn that off directly; it just lets you choose one or both eyes to track for input. Apple says that you can choose another input method elsewhere in accessibility settings if you can't use your eyes, like pointing with your hand. I didn't know this during the demo, and neither did the specialists I was working with. I tried to learn as much about the VO gestures as I could ahead of time, because I already knew I'd get a flummoxed response when I told them I was using VoiceOver. They were all like "this will be a learning experience for us too." Even the little explore preview on the iPad Pros they have for people just walking by was accessible, with well-described pictures, but I didn't have time to turn VO off when they took me to the demo area, so I had to turn VO off on the iPad I'd been using at the end because the specialist didn't know how.
Right now, it seems like the two main use cases for the AVP are having huge virtual iPad or Mac screens you can put wherever you want for more comfortable viewing, and content consumption -- neither of which I think has much value to a totally blind person. I did try one of the virtual environments you can place windows in, but the one I picked didn't have any spatial audio with it. The guy just said an immersive image of white sand got placed in front of me. I suppose this is a little bit like the background sounds you can now enable on your iPhone if you need that to help you focus.
The immersive video is also obviously totally visual. The AD doesn't convey any info about the immersiveness of the video. There was a clip at the end of a soccer player scoring a goal, but all the AD said was "a soccer player scores a goal in a stadium packed with fans." I read in a review later that the clip was shot from a view above the net, which the AD track didn't mention at all. To be fair, there's no way AD could convey that much info in such a short time. It just kind of emphasized how much more information you can get so much faster visually than through audio. A picture is worth a thousand words, as the saying goes.
I did get an hour to try out the AVP, which is double the length of the normal demo. I suppose they did that since they knew I would be using it differently and there were no appointments after mine. They also encouraged me to come back another day if I wanted, but I don't think I will. Since I got there right when the store opened, it was not crowded at all.
The AVP, right now, does not make me feel left out of a part of the world the way that something like all the new video games do. The name alone tells you what its primary focus is, and I know that. I am glad that Apple included VoiceOver though, because then I can choose whether or not it's a product worth using, rather than someone else deciding I can't because VO would be too hard to adapt to a new device like this, or asking why a blind person would even use it. How many of us have had people be surprised when they find out that we actually do watch TV?
The obvious use case for us is as some sort of augmentative orientation aid, but that would take something smaller, more like glasses. People in the mainstream also seem to want a form factor like that, but there is debate about whether the tech will ever get there.
An anticipated dread
When software first came out letting people vocally control computers and dictate rather than type on a keyboard, sighted people kept telling me that I needed to use it because I'm totally blind, and they could rarely grasp the concept of a screen reader conveying to me what was on the screen. I worry the conceptual wires are going to get crossed in the same way with this type of device. I don't want to have to try to explain to sighted people why I have no need for screens to be in front of my eyes. I mean, after the thirtieth or so time, I'd probably start sounding grouchy.
@Holger Fiallo
A couple of posters have made some good points here, but you seem to make it your duty to respond with sarcasm or negativity to these points. Why is this? It's true that vision isn't everything, and not every device that's out there is made for blind or visually impaired people. We live in a sighted world. The best thing we can do is adapt to that reality.
With all that being said, would I purchase the AVP upon final release? No, I would not, at least not until the price drops to a more affordable amount and the platform has been expanded with apps that are not only made for the platform, but that also work with the accessibility features on the device. Apple's doing something different here and it's going to take time for it to really take off. Think of that, if you will. Good day.
Same sentiments exactly
I went in for a demo and had the same exact experience. My guide didn't really know what she was getting into with VoiceOver, so she had me skip the VoiceOver tutorial on the setup screen, which I think is not a good idea. If you do it on the setup screen, I'm guessing you would have less opportunity to accidentally switch between apps and open Control Center and all that, since nothing is set up yet; that way you can go through and practice all the gestures before adding all the complexity of actually navigating. The gestures were definitely all over the place, even though she could see that I was doing them correctly.
It worked a little bit better when I was being very robotic with my movements and trying not to move my hands in any other way, but of course that's not how they intend you to use it. I could definitely see a lot of useful navigation functions, especially ported over from the Magnifier app, which can currently describe people and text and all that, but it would be great to be able to turn the people announcements off in Control Center or something, or at least make it more clear how to do so if it's already available.
Update: One more session later with a very patient employee, I was able to do more of the tutorial and get the hang of some basic things, like how to explore the screen spatially, similar to when you drag one finger around the iPhone screen to get the hang of the layout. That helped a lot, because even though I didn't want to take the time to learn all the advanced features, I could still get from one part of an app to another relatively quickly, rather than scanning all the way from the top left, down the sidebar, and to the right. There was still some misfiring, just because I was probably doing some unknown gesture without realising it, but I was able to actually do some of the actual demo this time. I'm surprised braille input wasn't one of the official typing methods in the tutorial, seeing as it's so hand focused, but they did have three (not very functional) typing methods. I also do have some vision, so using my head to explore spatially was an option I could use intentionally. I think for someone who is, for whatever reason, considering getting this device, it's possible to navigate with VoiceOver, and I don't think it's buggy; I just think it takes a lot more patience to learn the device than the half-hour demo session allows.
The Blind Life
There is a video on YouTube showing how VO works with it.
I think it just has a bunch of things I don't need.
Thanks for the description. From what I understand, the Vision Pro is for seeing the world around you and also for providing visual information. For me, the interesting thing is the idea of a spatial camera and getting audio feedback. I think I'll pay more attention to smart glasses rather than this thing. Hopefully, smart glasses can evolve to give real-time information, and could even do the hand gesture thing. I don't think I would ever count on just that for orientation though.
My thoughts on the Vision Pro thus far.
I'm honestly confused at how some people have no idea what they are talking about.
It's a completely new product that launched a few weeks ago and a completely new concept for us to grasp, because currently existing headsets like the HoloLens, Quest 3 and so on don't even have any form of screen reader accessibility. And it's called Pro for a reason: it's not for the regular consumer who already has problems finding their apps on the iPhone.
So of course we have to get used to how the device works. Why do you expect that there are already apps tailored to us when it is not even clear where the device will go? Remember the Apple Watch: Apple tried marketing the device as a fashion accessory; only later did the direction of a health monitor and activity tracker come to mind, and that's when sales picked up and the device was to be found everywhere.
Give the Vision Pro some time, say the next 2 or 3 generations, which I guarantee you Apple already has somewhere on the design board or in the pipeline.
And I know that I am getting into hot water here with a good chance of being burned, but we all have different skill levels, and some of us are not as tech literate as others or have problems grasping visual concepts. Those who still have some sight, or who have seen something before they went blind, have a major advantage. If the device is a breakthrough, we will eventually get used to it, or at least some of us will; there are always those who would rather use flip phones and never got out of their cassette player days.
V pro
I heard that those who got one are returning them. If so, what does that say about the V Pro?
I do not believe...
Just because people are returning a product doesn't mean the product in and of itself is bad. That is all.
Joseph
FYI, I just stated people are returning them. Reasons? Who knows. Maybe it was not for them, maybe the novelty wore off, maybe they figured out that it did not work for what they needed it for.
Shame on you, Oliver!
You should know that "Pro" something or another means that mine is better than yours, assuming I have one. And I am obligated to make sure everyone who doesn't have a pro whatever knows that I have one.
As for the rest... There was that so called love scene in the movie "Demolition Man" back in 1993... That's what this device will be used for, assuming you pick it up properly and it doesn't fall apart.
Re: Demolition Man
No, OldBear, just. . . no!
I read something on YouTube.
According to what I read, Apple isn't allowing devs to use the AVP outside of narrow guidelines for now, which means no object detection and all that.
It might change in the future but that's what I read.
Would love to have it read what I am looking at
On the Mac, in Accessibility, Spoken Content, the "read what is under the pointer" option is awesome for those of us with some residual vision. On the iPhone/iPad, you touch the screen and it reads what is under your finger. The Vision Pro is really geared toward sighted people, but it has ENORMOUS potential for those of us with a little vision. I believe it should be fairly easy for Apple to have the AVP "read" what we look at, rather than having to highlight text to have it read.