Hi all,
I was playing with a Google Pixel phone the other day and learnt that it has on-device image descriptions that work amazingly well. You swipe down and then to the right (like drawing an L, if you've read print before), and the option appears in the TalkBack menu once you've enabled it in the settings.
I was honestly blown away, looking over to my boyfriend and was like "Holy sh**."
I know Be My AI exists, and it's absolutely amazing and an app I use frequently, but this was so cool.
So I'm posing this question to everyone on the forum: do you think there's any chance of on-board AI image descriptions in iOS 18, or in any future iOS update? It would be ridiculous to have to wait when it's already available on Google Pixels (and on other Android phones, since you can install Google TalkBack).
Comments
Possible
It's possible at some point, though I think Apple will have to be running some kind of super advanced on-device model. Apple Intelligence is just getting started, so we'll find out what 2025 and beyond has to offer. I particularly hope screen recognition and other features come to macOS, as that's the only Apple product I may be interested in using again in the future.
I have a Pixel 7, so I have to use the online Gemini image descriptions. From what I heard on Blind Android Users, they're better and faster to generate, so I'm not missing out on much.
iPhone 16 visual intelligence
Something I saw coming later this year is Visual Intelligence on the iPhone 16. There's a possibility that something like that could be used for image descriptions. It would be really nice.
Is it on device with google…
Is it on-device with Google, or is there some cloud work going on? Even with 8 GB of RAM and improved processing, the quality of the output is going to be limited.
It's great that TalkBack does this. I suggest we request something similar when Apple's screen awareness comes out next year. I imagine we could ask it for descriptions, but there should be a rotor option too. This will use Apple's own servers, so some of it will, again, be off-device. I think only the light AI will run on-device, with the heavy lifting done elsewhere.
Thoughts
Ollie, I don't think it's on-device, so that's my mistake.
Chris, I wonder if it's possible to assign a gesture to get image descriptions - that would be very, very useful.
I'm reading an article about it now, and it has its shortcomings, but it's only the first generation. This makes me so tempted to switch, but I just upgraded my Apple Watch not too long ago, Find My is extremely useful, and my friends love iMessage for our group chat.
You can add a gesture to image descriptions on iOS
I don’t know much about TalkBack, but from the little bit of Android I’ve used, I don’t think you can add a gesture for image descriptions. You can set a gesture to describe images on iOS, which I have set as a three-finger single tap. That will describe an image no matter where the VoiceOver cursor is located.
Hmm
I do have 'Describe Images' turned on for... (insert app here). It seems to work okay; I just wish there were a way to use a gesture to get an AI image description sometimes. It would be information overload if that were the default option, but it would be cool to use, say, a four-finger double-tap to get a detailed description.