Hi all!
Lately I've been wondering whether upgrading my iPhone SE 3, which has a single rear camera, to the 15/16/17 line would result in better VoiceOver live recognition. At the moment I use the standard Camera app to recognise text on various items like food packages and bottles, because live recognition doesn't support my native language yet.
The question is: would upgrading the phone for a better camera help, or is recognition quality more of a software thing?
Would be glad to hear your thoughts.
By Felix, 15 October, 2025
Comments
Perhaps
It depends on the app. If you check Apple's so-called image descriptions, they don't do a good job. It's more about software than hardware. You could have a 17 Pro Max, but if the app is crap, you get crap. Long live cats.
Recognition doesn't use multiple cameras
Seeing AI, BME, and the other apps all use only the primary camera, as far as I'm aware.
As someone above said, the actual improvement in the sensor, processing, etc. matters a lot.
I invested in a Pro Max model only once, and I know it's bound to serve me well for at least the next 5 years without any trouble, possibly more.
Newer iPhone
I think the camera on any iPhone is good enough for text recognition. You may experience an improvement with newer recognition models in VoiceOver, and possibly Apple Intelligence on an iPhone that can run those models, such as an iPhone 17 series or 16e. That would be the biggest upgrade I think you'd notice moving from an SE.
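For what it's worth, if an app relies on Apple's built-in text recognition, it goes through the Vision framework, and the supported languages and accuracy come from the on-device model revision that ships with iOS rather than from which camera captured the frame. Here's a minimal sketch of that idea (assuming a Vision-based OCR path; VoiceOver's own recognition pipeline isn't public, and some apps like Seeing AI may use their own models):

```swift
import Vision
import UIKit

// Minimal sketch: recognition languages and accuracy come from the
// on-device Vision model shipped with iOS, not from the camera hardware.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the top candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate

    // The language list depends on the request revision / iOS version,
    // which is why language coverage tends to arrive via software updates.
    if let languages = try? request.supportedRecognitionLanguages() {
        print("Supported languages: \(languages)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

In other words, a newer phone mainly matters when it's needed to run a newer iOS or the heavier on-device models; the single rear camera itself isn't the limiting factor for this kind of recognition.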
You have all the camera you need
There might be some AI advantage in the processor chips of the new phones, and I can't speak to that, but the camera in the SE 3 is completely fine for recognizing text in a normal situation. The only physical/optical advantage you could get from the new cameras on some of the models is if you wanted to read fine print from across the room. They have only made the sensors larger in the number of pixels they can record, and increased how much focal distance they can fold up inside of prisms. If you hold the phone over something with normal (not microscopic) print, the optical and sensor differences are irrelevant.
Wondering the Same Thing...
I have to agree that it depends on the app in use. However, I'm wondering if that is the only factor taken into consideration. I am basing this solely on my personal experience. I started out on an iPhone 7 and am now the proud owner of an iPhone 14. It seems to me that VoiceOver speech has gotten quite good at describing various things, even when not in one of the OCR apps. I've yet to try this out with my eReader, but that would be interesting. As I was starting my laundry a couple days ago just prior to my exercise class, I opened up the dedicated app for my hearing aids. This particular app was previously rather inaccessible with VoiceOver, and my audiologist and parents weren't sure I'd have a good enough experience as a hearing-aid user. I don't think they're aware just how accessible the settings app on the iPhone is. However, when I checked out the hearing-aid app on Monday VoiceOver read out more of it. So this is why I'm wondering if a number of factors are coming into play here. Long live cats, and Apple!
Ekaj
It's all about the software. The hardware could be the best, but if the app is not, you have nothing. It looks promising, but time will tell. Long live cats, and not Apple.